var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
[tar archive: gzip-compressed binary contents of kubelet.log omitted — not recoverable as text]
Y>Ò:?McQ#+(~ ٕ)ɜJ@W|y:HS 3XN\EUScKbIt {E T9L~64:+?n~,uo/^]YgaJ`.(%m^vQ6zV\ђ0zIڞ鲮Rۍ&ufyh\(`bǣUl4'#A{]L׺J Z:M' a!#cĥb֧g(KJQ<[`T/7o.?ޟa_î(uR7E/ @0;]盟jagׅ5pK_*?Ko.4K+U !CF>_]UlС E‚ Tk js-lФ_kvTK_ˀ@_xF6 ;mM4xI1f %A(Mpq0vO8K𳺃o"2 mQg$^!i:DL`5 cp!T+P0:}k:FZ j/Gr}s$lr|Q<b~<*RI ؇CUC `g`2#MJ9S &cհKy$"kY"; */R"i!i{nWQ}wQ!1֐RP{ )mevn=F<;C<۲Yina\l8mxc"'Jќdsk,Q?p2㛣)û'NjaژI_ 1{A64Zn>ux?J}| R=8qFp27nUÇUc:4ZU eWQx&K&UUdB9B#̣ƣJZfR]h FAG&#LTJeXoъ'_}+RWX}-d ;r)p&{틬tz0ЮW_7|mh ; O?etuT1JL>d+,o`2%N ]W({r'! 叹@]Yo#G+^^yD^6v1e !O5٤LRjˆFVOXTj@-*%z#'U,ˮsŲ\zbYQ pP516R AXg P4dsh X#*8=Zmz?$+UC[Ua,Z]$LgvUpÏn |AinGU ԿAۻAi]gB{zSix7ro[ EPs~R+8fT猻FnoVmЁFϗNtUhĨPgZVZ1 J48/fY'G UXrq0ا3K)}w̸> ¬Vh( p<( EHDⰺ:ύזTGgDi[5" M 畲xDl 𠼰%b .IRʧۛ=ǫR+@M䤎GvØb p`ۦ65 }MY#q_ܔ}qD$Gk]*JIQ'9wB8UNu`h-6l#ўƒѪ8wf= !z&5VR ~RLq:kȅQ>$Yg\YRp=cTZJ97g @>{}Z_K#6Fu *jFs)I8jBk(1(E5 2 ٱ@,pV(gZpmY &h.֪h8#<9㔓FcZA-?&򧝝uGI[$r@ȽAu HB| A"rCĨ9iidlfǫn&Hv0WF Qxd3G]'饧JS.>s-țg_#2ifqdpp)UA$X GirXߗ\ mHI 1Dͱ[/L(I onzIQ6)f{@Hhف.>cYPT2K|o&p'AD ι+YR16Tz6Rl${uGݽ1ɮCz -Cl; II@hKe &R5"ђ."A 1vV$GW7 ^{8KZ1/a~ܜf>˼(v|sZ]_!>O#Lp&!-MkA AJS ĩv!ă2L,h5T"=&b(i^;ݹty.Ϡ6Ǒ: ϻ?1l]l,}WoadY7߼J >P1@CfgY3),C.qT9*  nodA;[Nݜ(ix?AVO[Zbw7v~,G&w|zgn̎1;|9\;W^:83SObB҇j&NV$4lĆy0Ȇm2.thIN3krk.Nm:c-NH"i0Ə%6jSNNWۜMgp;,Yg O<[&nIYtma[OL*2ѓaǃf~`œiY @'z.gn){݊j9Pt;cކ' F?|"VN˵Z84_=`4Sk3~|;ؽ`RَU\PsfDm0VEm 'AQqʺޭ)0N2];M`.X`yE'a,\M1]nx泑lovܜnlç酪M OVƫ_]/&5OD-/g0E#BZfL;&ޤ 장*= 5賞9jm}`ڄ}]t;V YZY/I zJlFr}2ibX ~l?a:z43˱t}TuH)nyӀ!)R-,-"T'p™"1Tԝ!=&o24Jm;ms*稒khy:%D P)1Z:nc{Y{ڷ3$Mȸs2#-%_=m !HL."3_rq)78 8 ZpQEU:7W;.㥿Ǐ1 aG6q5ZfiG; GSٺR3,?GyVR gE#W.x-x!kF.uB`xo*X޶R=xIOW3M,'z0ZURK׫:XJf 5۫&~gz7N*x}R$q=lT`{ݍCe^c^PNy?= F?&mǻWaCEvPFT3)> g42ՅʇH-#pm > q-<x[4$'4XAQԀoG h IFbTYT ZK)m_R! :*='8Qc #8w$Y]qjRΊ>g{<">J&ijb (w[ %QM 5\L4X$E36my݁Utd[=V*{}=Zćw=}n-j7 rЪytŭt=aOF>Nέu釅KoGQ{v]E&3z^jƣюfwDztͣrXS}uOƏvKojٜ4g?kzhuh7{'U=Yb + AgRcܶv-z~u)Z1-Qh]E\C*&fDk-:ĠRp3F|Z͝:0O#<5)^&J ΒRi~- QO?˿](6J = P?%4̯p ٯp ~?BrquВY-<1Amd-68c2Ɉpv2~oSYy+RXRctի}{8s{;˴|f/xhU$X. &dٔm  "`8/L˸uR!l%Q9sz$#㑞g. 
s^8(p0`&) I8Pe(>8TGN_G' VbwDs8ok:m9̲./o.٫TB»Cj*LmƧm~A%gԓ:j}9hvmu1벞uy{ׅeYtuXFr5|khYի8$; O e׭^Cμ"$>MowG;NwEGߌaK柳, <$l]^yX-@GCQ,|HphW6yL0 eYA5zz"H}c_dNݳꚁ'Hp 2+%(ÈJd|Is[WycŸZDpJ훠j%/O\'|8cW%Zs$MZ)Cފ^{-THN AR-lB.lފr@Z2In$߯nBb0u͓_5(`04O ܧ'fh#9Y \*/IEMb`"|L+݄ X3;9i7aIHeӂTTz2SKu-2!^.$GdMp#Kc!3" @85I*ʃ2#༆(c>Cg(nk4c[uq)9vh0e+\-Kkx߼R7g8߭~, ,$ u\΢@zwT>nbgζpGݚ\(0認[-GDN03O-'R-)?#D"8n @DHf T3<UxÃyYEjGF"B31Ҝ‰ଶq. P M6p{By5:^T.%_u8t)"s&aN ?}X>͏ؑ{9: F7,j];T.gV.C"4xP.Q0ɌQ 2 h㸤Ґ nyܻ=<DxIާ^s-q.Riǂ*( mR(;< }GRM0(1HuR1I8DJ ?FXdgcfi[೬Jyb&ĦQs>Ͽ${ƫ]*W S.{Dn7TH)#JvQ9?F&>I_տ.ٝbB 4``V9Bـx'~ M7LƩ( *矞ԥ$gEgf*P{e4[\NڄB`a5I 6 Ts_>ME1G1ƍ1Uont>[rQCIA2\ .P}McAF=iR;7nO,5@& c(I"I"sD@A i+h#44h5S]PwG0/DǸC}xp# ['VGLzh*e+1jcp!T#P0wdߍv#ĕ嶈k{uGQUYXCE-O7E][tʏZvPkӡH5,0wEYBV Ƙcl2&ʒxɱ6$Qs-9˝Rtʻ$kHQ=#wi !҆[W!f$ lp`sVk/;^s%8-*{MN{\gZשJU7|`C64Q*/3,&cL3 g0ˮ:׍&җya88]$G<^!}/)٬1emLixHigAynQ65CoGpfd5<^rŅ T'Oa ,|zzT%[Ƣc\ {X @t-_X! +,;T,DvI6oU6N+C@w^ r\^M*x-x %`WYY:}^fVebd^o[^X9:^ڽX:.tRahni_=?>1T#Ŝ9s !) WwV=s J\ Yu8'-}lplj@#/7x3܇0k[zu̞Xd_Hk+idbdsAЭF$tV/pR/ڮudMnZ=e[&h350F<ݿX7KuPmxa4)fT,|ZuW @Oq66C=OawG[֎A}YtU ;2;nQ7`{䋦 m,%EtAwQ.ʻm=K};@ }!;-hWl_>Pygx(֝߂zJXwu|w=`+ijZZhs\߈t՞{Iۑ*ɛ'Aw~'*P)YڧzwB(L%nSу0)z)zU=HPj=(J)c{.ߤ9j9Sѫw٪ '`-cQ.?_0gR򯹝OEԍ/RӅE:M4YdN|QnDQ+HHNG4 "AeWX4.r^=?*i`5@I+0H:ƽ2$ GHYrP?F@`Q'Ch4^"MwoOJP-m)Vns'1 )!ZRxV(5h'B|C|:Oι epor-,NJ9)R EixY#tL!-1^ U^Y/b8cnH׮o EKIb&1;e%%$Dc#Qj4sɘ~'1Ҍ/3(.5p?ήRs|(wQer@KԂ:o%FQ K(8r(e6HAqN1yB5bb71/-XKA\ޫg{_b r ʉR)b:[>%A)h`TX+ ~BFuiM>/u29>X^!~;(ͪib`91"mn`9(učQPv;_cOW#-;֛Ȇe|0Xvxѿ.{M]7#_Idzf(D]w|PlEjN "PY[&|;:)(%&:)1x)c(ŤGd6 Mb K J>7jJUL@Yi}8q~QBszf=$tBɔd_D+T4h FHCFK˙eFHU`j}`pS$^CS㍤CO֛TsVA ILO!xՏn&̿JH:[ol9uriWt1IMniˈ&YeTaC_Ѡo Sy~dIrAm5D=RYFC b a 㰆8QiI6)8^+Inc(Ip'gG4D! 
MAtp`KVFoU::D#aTck4P7*gEC%DhVLZ"$"MTR#*#<#U (26,grW fĨmDz 6HA8F@M FJ\F.:hR.zELRY7Ȅ#i(T#NG RfrZmmXj5j1?⤇ݏT՛7XC,!&8Wu7?Ϟsw3`8 1.x`pl'ǘs ,V ô.FkFqLl>Qm,LC"í"&EԯuTLGڎx] 6T h ^ d*°V ֋y@ix'͉h|X1tPԬtT}ci2 R'yI^}g *ULT⣻z5:.A˯$M  ͠*9rn>˪4 | l4V)ejy߇?<\N%tH,eo HRf yIG0?RJh>;h}pOKfI$!;d b}7=Z_1  e8^sXe;_ CK8S]{V_oKjkB&zgaTg(NkgS> (f4 wWܮW߼GϣT Y[yglF[A ь:^\\0εOpwxV"s4YTAue5GRsZM{9kt/B`LE?ww-8C/NR3oLjrW7#^5y[TK4dZt~ԏh]#TJEx}hGsB!WၱXjT ʔV0b]0>]j+2'A9u Vg.G˫8-Uqfꓛ0ضL$Ob'k= tͭ1s^ }`%,YWP_QXyJ`hi)1H+h3;*!H$.I=@F} r+`C{E{=0Me|SbCVdX=]7gCVeE󫞓KfȒ@ZQ"NǤah-z룹4/EǸ.O810 .fdrH8+ʆJ~yWG:$*8ʝ7\ 3f4j|b^1Tk%i3spW^A5 R\$94?f_.)Y6gövX3[0#.pUr.&}@ epoJ,-Ċ9-Af{C_!f/FO?e&#QHYu2LwZ D佖&Z܊f[f-2sr<3͜gsd[ckn kB82ui#mnz^rZX7 trcךԻLh=\UG ba#?ۀTlJN[ch d:n;5#yfykn?2FnOʛn؛p1nr[x Tsl-+UwP®_)!?fE, dB(ǽ~SWDP7L~H9M(d҈6!X1{2K73#DТrIT>!x>r 8M(rn Ȣ`;\bVJHH!3؇n6KgDx~eÛmsVrnd̷Q0 1*o|8j.T2Li%#La%fJ"R#܂p:C@Ca*YX-D0 ][gt#&nznzյ\ Ҭ=G N:&Y$mNGCz?ϐiay9N`.0 "- T)0,VJ\rܹYOUW=/'YOÙt. T5ħ 0 ~9 o@o" W7ǫn~ :,;[}5 dO8z~{۞>-DІ)u3(,R`#k;sWW2yלy;; Y$\Hf+L=cuˇeO3ay۞½LTyMFT}Mfw~EwW L&AŇ(PV@ٕ0_M]M`ڜ3d_o|īڥ4-arl$A.7lɨ@%cڗRYZ`xؐt3pQ~2ߙo%#XO: #_YrѶnu8= EɂT\,5z VaEw`~Fji'QPHXG"\JlB Oc*5(K)µzqy3?Toчg.$]1'i [9# \ Kk|5q]bD7!3Owe9a,*MM:wLڈ fs 3TRA)mc55!̨Sg@h|ABWUrYPT}HwD@C+ J3KЈ`@ oځn\ V j]v=aCVtfZI s'.U}Utvp1n5cR6T*@&T65 7<qo?~<÷~~?7} mQg$^!i:DLmxC^0|v[M&6FЎ"Q}ܠF;E5f) 3qO;8W:s{s;'zÜ`sB7r8L0+a#;kH)EԒ)EvT(R\}^׳,=M&-+Ja^}68RX ɳQa~1Ȕ,'% [NTdrVj/'ָr2r~|0bQG"`"RSFDD b FрG`*c"ҽe˾W@3VǗ\?h9Kӹݿe,.:_YSfq)2//QӘG"h@QAsglrD`p'%vpsHc N7!yLI2] /9ˋ|ȈyH_flrs $: |0@PjqDŽ06XeZ :pm!"/@L~ 6-`ygeVgՁI]m}v hJu֟^ i듷{S*M&l\ 9YVpSpK`"{&CDtX{))%J7Pb]`)9n/Y|fEM avmaЂI`XMܭ/XԺl?3 ݶꄄmsi)!;FH3!Zyϭ]g&| k/i46v%a?'Oa1^ߗ`(Gig| ҿA#X0wq/r _>}O*{`]S&ҷ lp c+xB[z)X+'$BJJu**QYP(+fU"+ TUkqw((ׂW`)OF\%r8qJ.q0uB*,ɈDҧ"Zzh[pq,]+.8D"'#j%}WJ̊zJPBN؛KĔh. __/2,:ӏgKD3U*Sfc|{K=̅_aTYfΡw SЫb8j N$/-XKh̽L[@Ov >0ޘTWo<s~8_>h.・Bȅ! 
.SyCSU6hQ9bAv`P^9"(\ C#)&ȵ!k(B WAs#!KCcN+kDӗ)X& k }[v3NZ]I0٠`x{k QH)VJJ96X*fﳅLk6;o6_i]>YO`@ָ j=>4; 5L<=wz+܋ K`:F$㑅`"RSFDĠS(H8Hhy@9!W50me k7{שpf;g0st[Q8wvUt<1\wAd``g]WGV+``ⰍK]Ҷ9d\t;p-8T\xteZqDM2]WWwӔfc&LB>_qoԴ}8oacr<0Im鴮׊ ڪf}6oҰI[.kenN^SiNY'"rX'o҄W<ʛ&$;4G;ֱ1Zd9% q% CpgDeL(DSTYXL\o>LndR-k*/ ZfcuZk6^V낧*>~Wu^_ /Ə:7 Yqy{l۝J|͊g80Qmp3 +[Ҿ`?n2k|wUMn.L0ߺkk|/ ?.El1EVX7LF.30\ oo_m Bj7'|f:KU$T"_iY"M=dICw=m^#$Kʕ^I;H1A^T 3 XFH 3#/s>_8򂣾mڑaxJS+rvgJ{'1)#89e# I/ˁpoB95N()yaNV@i6(ZD43ˌ3q&xi!҄ΩU/xϖ tvef(lMmA>_SLy}E/Ol<}rY6F*hQ%1N*:D#aTck4P̠h$yV,5-DF4QAJp$TT!H% M3,9[of 5^K@ OZ )0 #(lwH$7 kgJ0 E<_E@Nܿor"D),p# cd4RfrZmmX1Znz'O:L~N:(0Ǹ±D"cz nJu4-U\C6)A^3a6^&f x" Fs.CИ nh<8G!,l0@]N8,݂o *;tAhTi) ^oi13*wstdAQ^tCлIR\9'!U=d}>{]266$RD^PVł6; @B&KNqwUHT]vUq=tms`&]moG+}䐑Eqٳ/]n _%F)eg8IqHJҤ=HtMwUtWW]QkDlIQN)93Zc&zGm6{SSN9|'g:^Ho=')(!$[C!f68R$#Z{"\TknOqZ\D&evۮ/D2ܗ8u ueN4yAָodWPvPJ!+T#f\p;i` :$CQx^/@Cvf`.`v w?>bż:k'ͱ6Z v#å1?lBz+g?(|L'v0y^[nv@f54N o{=&WOs>$Sc_CRR~uŃ T-/RL͉jV_ݺn,h}uS7)9{ĺ:;ui֮-q$'\Yd\baX̵Q@ V0b2MB*9Mޤ j7i{n|=Ncavi>!uaؙeIFYO)%ݓv{QzO=V ;{-r6p\i ,@䎋vj7 Ρ]7rp!p}afx lOO/m)3 3O -%& }UF;Hd5&v^nE}r`OC{E{ jLԅ%n5X/-Y9=OV <|8դZewk@Аq1$T H) L>`9g LS/B20ean>YLX4ȒW$(k|0%6Zw1!Mgr>"CQwsL(QmGsudXDcR6JZ 2o81qVG݁g<>mP!/Y)ˮ]\A#MA#%| >Z|jg./A!a~_.KD3*fcS{o-(lbZ8fqJXmH}[|XϦ0BS'n$ln2_Ų-}}vygPjzi S*'?{]q8vE,U!Ρw rjI 3OF`A#+Via;7 qfA4I(Jw[)r Y#YŠRJS`> oLO2Ȭ`}: apy6h-JR>RL+G bdF/cLᚘ῕5k6-al<7ql/oMۿvKۊFCD%$@tY$) Ӂi]IՄhgo"88fa2 @?#9S9s-!|`#5bd"9  &mhAmFRe4Ƞ-( Xkn9 S[UtF7 #z!Uxd!R&.Ӕ1тFQH8H7ؘ8k q͠~ddG͠i,.^vu*ƢMl|nnG=SGv Aç&YfiϋyLxTBnKPVcMc@c0AQgACNcF1)ǭ 4wFm Q1,%! 
jCMoLg})Ee=vfG\s-&~\(<#P?64/-If 3aT`: D3 z|[$A 4xJbeC>_B<ŶjBO,H%R#g iπ* ^;DȬvu'Q (WK 02`82 (0b,@Je)A@0+[ 8z]UW@ OqTaI q=tHǝӍ'- Qca;>3Y/eEpݘ‘{ zEtӓLFrg񴿔ʰq4Y=ѫ1|PAx se`?@Bsͼ=dg\[yuqr48[ SO6;\\}KCN-ގg7n`յUmK/n麮Rی&ieyC_(ԋi&UG\wFK^dm-ozɺVUBIô*I sP.=v+fi_nT^$oRSj0d$8WLB8z~XmDSzuya_qdϳP#w3,lІ[UpF2UL+140PlB f_24"Oj2yJܭ=HͽtZpPBQnEz㆓b%)1DMY%<@rFϵ,wJia6.VdIVJ5ǾzR[n( ^G.n0DR9P+LM3oN-Yx$BHLO Xyz҅&# [pO>A/dcGPUQQߝnyq>ymoc|6\ L~5޼ ;_߼LZN+m5So&xP`օdQ,3VLD7g^_NKs7b:kroG3\{ɓn`~=l4{W~T|E1eOW~-2 T TU]ͪat; Kx dgdQ2/K5D2w~]w8mODM=;izF }0ŗIYUҲ;#4+uAJ{mnNSRKg>W\s OΧXMr&\R4Ujg祧<ŤZo4̛GrZ5@#,¡6pd3(`> DCwMm;r cE06}Ԉc6@x-G|ggJ{'1!Iie9ֈI oBS0baWJ'0i#ZA۠FH"I`ci9I1VFQ,4H9V K':SѳL.-ʚ5iR|RpJ,V+k@@/Aa%/)|SO,|ujm62 *ǞGKH%SX 2x^H ;hO<`,o(2mqvUglH 3$/uW(^V;$ti+}A”DP80οzMpV0#)6gF\s6;˜B^G$JF~x.NY%_[Yb@+,u@hx7=|,$Wm8 )jSIYW5=KEAݫ9]M'7)7XH>m|2E4^JJ8%Rb]`ɓ I |' i^)~pWsUVVXM|25<.OU5F=N.G.KI ̗%˺:}a♅bm17,%//.L ,ƲMzԒ6ldKm:.jb¸r営;MJn'Y$[nbwCxV&{d6,Gkf4YNZo }.ߓ p3MDl9i)A/N <:6~;FMTQvǾ_3PO,x4 46I&ػ8]T WaI {o@.;t=h AIǸגZ\8gDeLC+Deo照c`R{k8E=)NEtnKBzy¯X Ժ'\ֻ;ocsߏXYϫ&a?I`߂|"P*%˗Plrnc\9\G;MX! 2~Tcxts׏?6[gրfY,X3(8USy әw[Bo*LwxJ``)$lV8mTDz {3:lfuR]*\ɝ_A=@My3 J7_>fmU32hrqvz7ft7Q]?^ZJ]P+w??9sRbrWwF5f%O5Rxc0R3R퇚gk=W;QL[msۖwYb|MkBrmyiyk YZat]G%%eɇb O2:w +0!Dǯ I)_]Zmʖ㮝lϷbU|}mnNqv;Na?qr^Է9W;Ql = f ~;7qA52^܀:ގUGdESJ_Au&QPR d#ɧ0&+nD\%k~@f͹_zPoեڄEǸ/ha78Ŝ9s˭#:F g}Ue2ѻ Hǰ-k`k.PHx3jg>LLj9+j$Riw!ǽ 2~3k3d7N2k$iC\3!+|Qi#>>~Fm@ ep`@xdn f9V́EA7Z\S( il]?ցG5Rk/AG x)P"bl8V{e)㌍!D!e`[0՘IDk1>l5ͭB6-UTyǤӶKHk4ÏՅ}QG7^q$OeOWef(p.p7dK26h.LgLevCדUx7 pscIZm4.10oN4tw1A 9ޟ6kJOY~鶥slF8&䈂/'kB6R: D;$zrLզ3ƒMg?(c_]Wk DnB}`9ӔFD Jm; ϸ3|v $: 53z V)wLch-Ah 2j>S1G Emԑ΢`;X+\$JOƶ*63h|ACM05SXs?e|q١6^gYlGmWpkBϮ0(b_-`r A" A9٘3[mɻ#M<.kfs @i:n"Hn"Z^ԛi|v@KLHxI͝4ẖy4y9 >,G]bT}60`TYfΡw sjI `1I0.ZZ/ IMƉ0!xT8!,LMJnR"Z/hm:E4\ʴUi-EBE؁Tdhݓ$RL@+rnK5Xh&x#32x5sf" ZNQsz?M; -M5&~ 7KTPE#K[u'ZK(Iz yI=R:" dwz[e_E.qVq`bpG~FrrZB`# K ("K!,=s*\˨ nqz>u+¥&LI ViǬQFYJy>%,KȳG[Ζ. 
S>?g~ݱiI尴n#hW.dqkӭmxqZS;1i#f̭&P BEC)mc55!̨Sg7@|BE:z_Ru Y5]*#RF$HYIafXTqXDIp{< |w+^: V^ZXc&Z qBHewXzd[x&LK:/Z$$8XN@- CJ᧹A Y R|K3uFi}NϿ0ZU܃8Z$5ҎEg/eq(&ɾ-*kUjgi]K}A|9^5=ْ1AEڈb'X1`)"'7B J1u'(µߟcP'ɽ(؀ ӒK R{ HϝOꐞ'. q>)u~@eA&}vcJE2P]ifiN93_Zu ,QF- {t>R%0GLY<3}XnVPv|9~ SO'sm߻hKsEjoeNf0%n? 1FҪq$Wt4 iF8<|Aa0,uP0Q1A&t1wy8͚HQlY% MG-!@e$ ́}>m+filX1c( BiR` F߭!ϛƾy4H)ʝuLqpx7:W)0MPDDB犉@G5@WFih<댺5a^CrAzTF ['VGLzh *e+1i=BrU͡O ȓgD7IѿO=:|<FNQ+0` IV` btZ<Ź`1X>cbYGH(򹖜N)3.S/ՂaTzqIsE)b+OG k3#Z{1[,lWKjZR/ Rm_Win]3v)jS !zw3NPFikLx sZD)3ln%9 ,̝u,b6@:w f;h/w2EviBIHyfYq6,&U`4Y)=V*L 鶖yF޳m$-~4w W4=8\`3"K$q߿YzXe٢m-r;ݞ1(tfGh;SoCVNZRۤw |4/vKכG_<tL2~R2ype$JEf(]thJ9bE ZP2z 1r96A8*ۃJˆ^FcD2Y+냉KM-a$EJTD;> Gָ|4z3'[ݜG{t>]6@|a˦~q*0t󥹘c=&ϿxLqf%8 IQ#,y@[K,5s@#,qӘG"hEjleF͝QFG1CT KIw`阃;v^g=Bgl4[ʭ#qYmSU]l6$Uln5uOҳ`$a`p%`h|9[J_8~>!~+E=׃K?+]mBCϿr1V#N=AIY&dQ,R@^Wj~We .]]I%ުVQ U#yWS)ܛ;aJrhKշ_WآKUjEM>=ar1kx o~~_ԒN/R[.i[Z@b4^J)ko _=?D\cfb׉k/flpf\U! Mp!ߗ@yZ?s ,殿ަ_V?m>z#EO;k'IvaaɸX"U^:44xإ(!+|ȃ`S[Cu~w\T^xA%C放BEm(ͤZ:/Zw[/~V);̱|}nݠiA)n0}_=V#G*ވP"JJPj WD)E#W*NUV]W%غY\@qEuF#qfc &p9q;/Rfq =1{#\싸JJ*ADW/P\q-{$`ވ.c0AK鮋BFW/G\ qrUpT׭8o˳tjy9󋝒_g!rFdRbJWt #0#B{H JN"Lfa:8&ShKt?JKmlk^`#5"#r`"&vF0@(&6(ltx9( >V.2BAqL T2xaa&(U.#S/8֊c! R^!͜I$"JAۡS(#EE\I(0h(Q= ͤ!pV  Q[GDrH1r)V(M*&|@@E1%mBND`؛$)$L!Ie I*C6RQ+"8TCpfT^#F,Dj" p㷵q{7߽޶!P[aO"ܡ|ms 3DkJF)A%#La%nh BgK2怉6IKBeRpNnK?LU`OaR~m#$\d/Er\d/ۿ5ۛIz) )c`yU?SLXNԊHC%x6>imw1.+ (O̚k.'|b.'{/t Ks\>1Os\>1Os\>1OskiYuw<ٌ 3:$` T2aT`@:>UWygcw'|hmA:c|{&3wGӠ$ Rq"Xg`*!cV;Pu'QCIVOӅIϥZ:Mmnzz.&-ӧ[OM-J`-cF̴2#eJ%0e Q壦954r0+o9PI`Z _@Sב E# ϜRafXTqXG<щ$V ʏA0uLR'aSJS3fτSkT ^B*KK$*3a"\ZI?ejiFR<0LB.`%!vSd"N1y.cĚ JeeМGyVyQnC13翃yJ]-D`%O`| bGb$ŷqQY>A|w|0.*ʦʓ{<*(b%ccSD')(8xB0uܛVÀ05iatH\!=wRpN)Rq$%r9\$ *It||49;?ZTRrwҡh,9գ1(|0:RuvX\{ftQ/.M>V_;_LN o0% 37ާTq(Rt12ޜ 0xj[I֕_Y2uMt2чQpT )5{gɚӬ_um=٪t<@>(1ȯX$>Ű>!ҤR4w=YoP.\~pަ?oNuoN>voRdl ¯ |S.]YY=7[c ՛y A-N͸HsQ^'iF2d ? 
@wVg:ƅkhC]!0EUw ]4dɺGnYR|%BS³pWܵSݵ6S-lI*1 %F&Mpz08UF_&OƾПAF=iR;wf|*  _QS$Za %%I$q(p^# xXmD͵f*uJ²b?wCa`iNJFNhí*8# ULֱW$bo{ .Bur5-ɷuߺc3AE1Qi?]F&1^ = xkC:`Z%ܧ>+B+J})xK&(Υu_`i]&ݧLHMFVzQ2li`q#y9R1w-R]IW',/"Bzw`4=OUcQSm`7Fd6 MbL.nCiJJpp^TA)ZA۠F Lj$i2#$ o:v0]rmN?`iA=7=[mG'J\)YFF*hP a@ F)͵& Gk%3vƹ%h5T qp#"X & :8K`'FTuQјSB) BX1iDF4QAJpBJ(2vg!︆Pc K-)0Lj(05A*x']sFPo ]45)fp# cdRfrZmmXV:SܤK䏳n r#us X <լ%dƨS)*a mnNwF]XԀ7 W0sLpC@`QO=(^E0pa0[j/#0,ˮ>%uF"e>1M7qdPAUJ= BK"LKAa%0nyhC0OH#-{̔ ϯv&*+ 'wP-w<֩/icud{R6CC8[H i@B][ B0ރd`MϣJB}kyt\\nqk[d-lu_ў5[l"Z Tڤց_Ϻ \Aws6*Rby"}4OI^-U֯&F vjm1N? :_d+f$r`m@ߐz_wPwZtN7%~ϮV٦jqo{Uh+[j4Y;Κz 2dIE*L< KI VDpCK,sKy,E2w7i %j .ǠdtgX F&1C;DplXP˖\ר8eg\13c>c.Y֙C[+3#3ږ/'*Z1WNJoN[jjs L]ȸ͠Ց&Y풔[rgIoY'M9@l' o Ʒy}=6O5$7襐ip Q-63qM&j\MǾZt;n3t*-l .}Wj07Cc(Jcގ'F__vz$C, q% Cp$U,G˘Qz˃ 'gߥh^y#w<զUO7ol _RD :)ЃN-_(UJ?MbiX,Qd eJ+gI1ɡۤ.Hm$9/gm7%˼ny2.NeS*$AՄ0"Ӥ\x{ čAEq1{g5X{f],S44#Y[nf;?i<[gتVdo5]h[ӵw gG+?9 Ĕ#"ᯠZSi}wc ςP'aFxFFR!D Uc 9$A(o4ٻ6$ewԏf& bB?%(R!)!)JP4({53驪j0Z'A5ɇ$,ɗᒕqJg).>{j1nd[ڐ6h:1:i5J$\(lTJ1БGT=!ʻW_xӷđ/yWm ?kN|ۻJ=Km1q,'H;W֮5+Ks? vv@ݠ ۾hp\]5Wwqqyk ܝc~m{Wwwy+t2rٻW..~'f-YV:7N&㛡֞z61+\>]i<,}t&/Cq=unqs׹F-d/롋::k;c+9V UVqY9e< 'i]C? x> i42?_e/\F yH\EE3B]HAYDkäef=϶ \|֞kTܸ&1VY Bϊa;FO$:Df43^x IJNӓhX^y\BIYym^H 8+̵>ײL>Q(+ВsA" Rk*>4Cʠv:;͵HR(IJRrΐ5E8!6foaa2wMZ;wD5 Oqr6[*@/|.` Iu#0+șg=j`LS @0X7Q>l}N\Ot䝟ytGl+>Eq2pꮜ(|ϖ Ey| ̰FKk _=&ϗy4z- ?Z#u,k ZT1@@S)%PxIHs-JH q9L寄^DuuP)l8YwVQE̓\Phq4|z̞ieI I0M4$#\2=. pN爙Ec7aYiclL9KM,jŸ6jkI )"ar4wab:阳1m**emLD.RPF#Kq.t%kZi=h {pP%r=u 5=y'>ߓ{~O=y+BR0۳a{Pسa{6lφٰ=gl؞ ۳a{6l3$^A* 'iyq^0JoRF$ɺ9S2>D=߷| Xc=߷kZ=߷ޟR89c)Eqzxߪ]=4*U.U>HSOsrjDx< y->S ٿGPCSkuqKh%=|nР&eUe! g.Ar+L dLQ}4<!Je2{D.K `]୷AJ:Pdq<'t;;4Vr7|}_yj0V4Ջil۵pyz77erj7D;kw4nKnT3Up4T!.H);!2ˬd wAj|M@GB.aP颩OW{4I0a(M6lܐF'E@-Ur^4\}잃cl*||:JY%{neMA; ^j#GzqNkcHdPު4"YieBYv9?Դ"Y *"!"M'N*HrN)t͙;ZpR@^U0:{טuBu?kq)+0pP07JDDq>|c)%9/BTrQQ9My]b0WC.3uL}{,= gr@@pK b3.88gFKe<lNFGZ[]'  TȌuٙnƣ:~:>>*Ry>0 qPl)^}mHD#f~%zj/šJ*p>ĝ&h1W2/h8)%okϴz\>Go9>dq:8Z:s? 
y4<>iV] כv=pkk֖@mn鬭ڌuayOpVɧ˽ɸ?<taщۏ]?l_w7Oe]YL4,漧FbT=]]Ol%$^J SpRe.H"1c[DU6xqd6$xNEoy 3#lFύBE-=)Uwun ވmz<1V^~}UeOTui+zoѶsǏ&TfQo\`*Nا JCc>}Do[}**pBh7v"B,j1&K6mQJtmp9!@ He^Ɂcg}:N2غ^LaʖeR I JG>$`$\J|䎼-K4#)&4"T3-/N{=;7Ni&#|h%2`=FX '+IZ:3ϲFH$- qLEXJX:Ύzk!lT:/Kx44p,chVV%@TpȊZzٻƑ+W?mҳ6 B$a\OpO `u^O邋4/p/}C7Cn'22,o~N0r5z/QK1>.MaT΋ ĺ P2*CEpWW`Ou(@2 l0Rs뺖C{48_E^,˧&_=% NA̕s] ~\|̯U͋j t\u`imw1٧+r;kiqfOǾ5}3yVܷhzfym﷭BD?0.pYS?nr )j*Rg-FFy]+BzG¯gsqִ ,s+8iX,Q` e +gIq|Ԯ$}v_ݷsvߢ˫ea>O߰S]Ƒj76a[e6eVXþ#M`߳5\{xf39`y `ha)1HQߩP߫x6G Z%Ì2 "8FR!D`Uc 9$A(o4GZM5ЉӴܥ|-vB9?}@kJ4!GIn~ao֐ "̉^`#5"#rsĄ&,{]7ꧩ0}>hv}愬~(܏0:c&nq"n.Oiy_48G6n30L6לݞX<np/27v2^?׶Z-XyfݳwM̿gOMg~mCr=u^:KNylq=xY-3T+vNI-q,`BqL ͥAO}#м>4;CNJEGpfTjCgNHM m%TVZGCJ9ō8`1wa4xPƽu(u6Bs4fKJ/`]˖gX'WnT]Q7r/գ(mF32PpR'OqB{ɾ`ENKuj~^by^n%+^z %8uU\P1꺺JTҫ\Qg JΣDPbE>#u3|>*5JԲΫD%GTW`LD]%r5;u]WW@%CWWP]qA -6rD8ZDdJI8#u+y>*乨+V#uuĽ3!Օ@>Il8JJ3v}:8e 5ݨ dr| @o5_ "qOfxwѻ͚eylD L?SCؗ#Fz<)gCW.d% ) Ƙclz7qF׶`ޔ"_hY"M0/KQ^UK ?~( Q A@2a0x{1XTd1`ČUafE$P#} "/鱓DyPRNյP>C@܋LK@OoL0F20~J̨ʧdY܌ ໫f(e/33p%=+ΨΞXb:Mh1 ʭ,0`Y>jčQОrtyEt=JWu%cOZsr܏ZR@)F1=J;&4(4f@AJ'0F+T4hiB$ LYfRC ll@I:3'M?2^2@yK*r"t )u/ͼt),MZnҜ>өS7i\?Q2v)d%rp5RYFC b a F)͵Jo@ʁGDVl[;ޟrvICDXj.@#˧0XiDR6FcNA uV93( Na`K$"MTR#*#<#UR aEFr嬒@̚N," b)rlwH$7 kwJ0ҠE_d"^V;󿋘 f'Ȅ#i(T# .KuIkV`a=l%ǽtZ~N9(0ǸSYD$ c n hyGڮg㉽ڮ+r]_wގw!\`42z`ERa3{j|Gh",ZzJ" ,|]62 *ǞGKH%SX 2x^k!'ZX-#x,آq#)啎z z_&̨J1 )!zZUH)N5˳=0>ajs5ΑZ>454x:+uf0!2E4^JJ8%Rb]d`);$ Ȫ5[~(;sMt-hM<)J=lxCjpg!r'S߽d^0{<`xֽ7^&^sһZf'mDʤCiDf9t\z2ZKZne t䶮S0Xqrn}%l>-LS/eaJ¯G&6˟7PTgar˜Ij[o3G&)UZqܲ@(e2wzRD<5渾a5ūy'/TZ2։G_zח |PPTq8f8ɰ7!FK U0H:ƽ2$ <G%-cF!&o՚)[`w) ?o^-Ay`6^XpϯyԮ-=n ]M{ y{yx$Iv@Dwi`YqCRɲPĴx2p(-GpHb\p!u"8 {ZEٰ܁}ܘ<3I[~RAn*' 9"EiyP=/oXVe_*P8ICQ(J!$H!S'\1tJ`U Jn)`-cc䬶QNlacX[(6l_(f>䚩둟_zݧ/"b !L̮ pg٢q'!(fJmc"^&|[vZ;=a&4, wZP@wat<:BKtօ$AEemv'|XcߑD*@hpjT,eEry E|G @!@.&tfAմVZMۦmA)pr Z,+MQ: z43(qxrcQ<ظ)[6 \\: gZjK yn/G_Gҷ\ HmW ]*b?,[[Ct&OQD"8n @Dg T3<UxÃyUYt ̈́RH#BwxpV[8QȄQ8ۻ M / 5FΚ-39ߛ}ݰnI Z᫡ۮ8=X2Z25f]+|g\g!UAD83F3 ,WTDJC2܂'IpVB|A@{WqˠN(r4v;wsw2m$Jk8#thr'$x(^jxZPz"+Pޏך3JD[?JC-J9%>qGQrNT] 
?GŞ4c['|V܈B(rnN`9kd۾mM8*Y♥2Z.I:L 'DFkF-N^fSO0ǃ@$SBe80(3Y`jT~6pIc䬙R5tɡڠ' w|bRtu}և^ϫk>%3_ZU!мCJ*&P%"elx'G%pPcmA"9P*h68XV[\ )ZcKnQ'a n|/Z֩.wxv`m>EUޯh=xD 6`r%!-Sl2 "&zBZգmFTwjD$raR[s0jލz엹R0W5W}U&XЃ1W\\ejws45W\1a+$X8s2J컹TjӚghnf\e:sPRޛL%Uz (sH+$X v0*+PUV}7W5W\ Pd2OP)s5ZC龛L%kcW\I!d`t\erL nJ*ճ4Wuw{UwF,~XV=FFUg%~|DY_arӝ9F*NYP|r`'wDɪ׼>u//dSєϓT|_L"`, `gj Nʌʳ:_~ŗ5\R 5B)emahBp,B (!r <>HH.YXndT+i QZ\M("@;tvAd+E:b1Fj7ݕpOh;m{Sjmyŗ[<1K70y 8#1Nɋ ]YBqP FE*j- y 84Oޕ$B[V~l7ЍƳ1<"-)RMRտ~#HI<$%EJ.%V*"Si*X{o1lڊuMd}j\> }1EnM,Rc%]."w21<(n#-y@aSVSѓa2^#e +maaaaNʕ8ڸrieĬ@X VM73a't*uXA/TT95 ^]Vz2n-b&R}#p*y'$cHjD>7)O2 p%cPv>3+rD~'8}9K"|8(~ЃqLvO12laR8dɸύi&ϘN 681:i5J$\(lTJ1БGX=1o杣W* gjp:a\ ]r~s0Eh ͦg-OA\TpMJ":*l$^gI/ݣ(UL`L'(Kx:gDY,ZI笑9uuR+{y7/vHc}}SiS2߶Ul A#A=x(m@$@f C-D:`Yvˍ>$bPr1+'CK" |2e3^qy!lCzh=VYXf12$*8Ef)99!JsNz6AWզgnl~ yu[~T ؿg{Or&:ͥ/@h{ץvA_1٭Z7Ժ]iZos8A?lͶCjPfNwm{|xws=z^iJz-nWnּ͓sٙo]N-\YuQ,vb+l[8ݴyN ŗ<NgҡMFuts! y W |Ƚ>kM米Fh3<7hsl@AjqxOvPWtuBJ: %l0#!h-N<{O EڅDjY.qkU<$ܢә!U.$Rj^;%&-se%tv+s u ෲuwv83l鋂 Sv0L0u M ëG«RjnJx5 '3Jn4Mj>o+K 8)m <̮;ړxgO:y{R \@&R6kZ/#%^`VQEڇbJ AN'ԭ(BJtJ2p!k:[BY`Km̵]˫Mg7T% ݬIWP@ܴ͂Nqf:t8oǙD7r|tHJ;=uC.0..x[!ޘD."ʃnLgF' [s~nEO_q v(xs{ֆ[>OqOqʯp_LwԂ9˚i0}cdhQM!N@$!ϵ(: *6f`N'ӄ"l%x)G@S3B*i-)A` \m:Y?یA8|61Vz:Y bD9!fH&mR:qzw=}L]'&wP ȉf :қQ#gFd]м8tB4U%&C`%Y[DbN8R<&3xIs-= |&!U[2VnɸEVB]YNTO-憼#{O'q͟5h`0:?[$vc 'dB"Ѥ6QCАpdĘuYCp:G,[C'uN<ò`Rp )J,ټI[4OJ,v㱎^9L h $wY. #>% 0P9Ēm0tqZia`/œByEt0GrLJz ?CZe=tx)e=l:eNve=òFv{8>[Q𓣚 >45.5>HOsnq86:C)=r~t APW[|6_[ ޏ~ؙGy:jUe!TT\VI2p]t2EBԭ}tRY C2X!xmRN(3dɱmզޒO&e)_X̤:ZPqܔ hxJ%Qy?}:Z\rb-S2W SIV;SvCdYvA|&&q@]@ uhd.iYusCT Qf'1cKP%$TnnOǣघUc3~ڙT[R #AM2 Dog #z;jiF ڈf#t; 9$)/SY!.#=S;\J+hJY?`o?/o\7{?HaY܈|ioŒc$Is_?F&tm3_w\OګLrgz`}ӓ >n)~R. 
t&8㒉7'2 ^ɔfl =eaxݻ V "}{>iQ#WVu6LĽZѐ[hn +:+΋c j)̇ښ{*Bu=}|UpvdO<:t7{rQD6jVyw`lHn #]lFl qUbyOp8je`rFOn_[Gbr㨂muMnZmcQ'( )+"Ʊt}5!{ŀwhf5#JnTMדּpDׯ?o?TG.w}?ҩ'ځ)Rzgx 'L CuVǥoߪz]}q֟wdo-p{#gٻT0OhD{3E[8okmbCB V]`qe%oǕ@H!6 TW>lՑ󕱺 jF9Fh;-p(?l_@k7!+2SY+U4NeeO.J3G߷RǑp%$%E2*eA#C)9h_yKt픺YH¦e?NC̥YEy@CΫ dD gdEަ3BU1BMpzuy*!hQ!歰5C(=*wcMQ'#mi'zk_ ع=HӐ'g-F64hLǗ\#yZ2,P4 YjQDkbU]Qƻ/i>SBH $K 2ytbp.ifLb:9!T ]'> uW fE:J28Zܙ/&оP>[Bt< MA4,# ЕzZ#ݦ j% P~(640: ^io4tKڸ 8 f /cm0`cϋP`[KUF$d15 ҁՀtCl/E ]Ѭ-%%7ٳ(d3N 4HOg*rMgOq]SjOcq5>'qwbyݮk3?oc|i͐x 9F(oUb(dXN&.cv)F"Y떽lN\;IlHz@bF،#ZzbզYx4Mxg/w~ݯSm+:8 S]7#8Uʞ-nL \5!4AihRvLy,˞sAZNEz\1!%g! ϘE F䤴`O!ȹu5DmŘ(%68ƜF f3AA@_6D6=jB =%wsQ% Η~ƍW.bC{TnRd31()1In@t)=;RS.idFLΚsqnCf\r^w|v&13^,3M> -F:ў)RH;$qge/B;Ɖ[(C qLE9XJ杬gզ'^\xZ,BǧH.tܠa{fIF*]NgkRJw,P =?f.}ϠH%+U{Nr_8 !hBp9:ԪtpJ;=á=9~qB U6 $ ]47B6F[ڃwmm$zIH}0wsl$%WG) Tυ( IKMۀ%q]SU]U]Q{U'r~`*"B$^he3 G= L,v>PK)}-ij59/%> [?s )vJ T*xΙTҽ+} hAhӤWj1s1}23)`ҏN@oy狖Iygzי(o]j4v3m®Aҵ[A<֕-ֱ/*L~6o*0KM,Yydc"LDX,-,x*X!7i/* +}ǟg`N_-h$ ݪ`C?gU:p||3M{S{SH)wSK3 B^ڷ4'MRgʷ}dm2k꟯;>\E!\%\L׏"isM@߃N" OVP[jGQ)RyH\Ԕ D4VgPµՋ-rUSg'7?pa x@ `jMжLbZ~M,Anhs5oMAq{t8^TwwpY߆̯Y|awo~55ڮ/UGj r: zy:h?^bU/_ __ /_ /V /MH_ /bi巴4R_ /\^ S *f_P_ r_ /Ŋ*_ /%վK\QF25hBŕeNF]<(lbRcW6(SY8S39[$(_f9>qgm,Oפv٨>U6_.b6lc+{b=uDhgѭrtO6q"gk(ZYJL%0" b?qğ83(7 |f6 !c!%Zy Tg=*,-Љ'?( |տ[pm9ݭZ}LB$HD /KaHGʑsD99bagED2>Fx[[]phڧxrۼf{˹Ǿ-37ݙ1BԾsjNEǸN8WX1teU$XLjAG盂X/.#X\(wހ(zƌƁZ 1MWL#,8ZI}jw9O/'þW[]e׋G5ynߢ [M=Z]L̋l ]]O.u'j͍j+D%}NU&BȀ.&o}aWՖZWM5ەjsMu]K-t2Y_wws d[9!/goڜj/{n7&\P)*/CPGq:j28)%UPU3_)i4ZIߑIg{=r^8JEGӘBP!<{Mg09$2hX)Ql=N;[B7zf֡tv1m Ak /.;oyVά^XWlpfPLDO35֛ fiو)s٫\cWۨI̓f=KF$ _L}zs"8wA,Jl;>\E!\|cn}u:%qʧozRe m66ypo$/.!$WV<]s:Qct~y_owt/;hv;x(c`A,DjomW דTzy:h͕?^s~iJ+#!XmvȊgnb¢A)AY_z V)wLcί\ȐճcB)7jG V Q ATX1[m:K妒صX7U-SX&MFX>RaR^V+k_ND~9yJ>,g_4ʮ?G@&( d  *pg 21b8j $tX'&H D 9N`.0 IB9U P˽).9E4Nk@NW8< Jq||_NΪ7#p-7ú0NcN@1N啶t[dV(GkQIJH1\vr4z&x#32x5sf"(ܚ195z+Yta6x.u!/p- W)jܲ2bv=s-4h42>}k(ލe +md{/#rL(,-1VRywp]HQk %ImR!`V0/CJGt`ZDLr"gƶ_r.Ekg㎇jmYkCUE.qVqy \GC9Ji`#nQ 6T"c+ ( !8B35,;zHLdu 
9pN`^پ)V$H.%EK"#ƋyaurVJRd%e:DY=Yc$(aztcTt<<)\lŨ@cWRYY`xXs2x}y };̌c@,*}mĬ<iu;,%7}'YKFX`=s* H-Q;M:dޓirE؄ A2 ǨU1kQRk R5,s@89[-|o%J쾚>vWa*dNۇ ]rMwLڈfs 3TPQP}(&Xc죦954r_uP,ͬ(h|*#R6)Od}hVEwZI%HD0 AP$E5{AcP`?CShWJs vτ>mѴ$|2.0xqZ,K/Q{xTτ\pi]aganiAR<0]*JBĉ[Jx"N>c&47HaA0#0ˊ lg{ [ncJQL_K^ၽ|j!|o.RD4/5vCa@ 5<5J0O)3wp7ݩcO佽CS>(b+d; B#L9DRP;4Ya骽VÀp5!IsT~r5n20!Cw/}(|}b|3/u<\.MHY}}Z>ܪTQ 3~4V`Ȯ , Lç+X+NƔ9ɏ[Զ4S>_;o?^ {aJ18gR}3K"`/WܖgfQbUH?E0wMt0чQ0iJ>݌9<Q Z?j}=iCQgi!t2>L]j+RDنRRTOop'?OoN~ &_^|x#:F#AG ߮nJ/W8? g.e\w)aD Q ZH{f Pw goB45!0 ݄P6 NPwѲ6Ҝ3_Wǥ0 Q`C< v nYn[Owت͖̩j〡$uIv;1j`w'oc쏑Y2..hRIjV:Dx-w2snG<yZW[Ow/Ln4+6̙E@:?|-'&Uͽ" >\0 qr=< 3`-X-Z cGW9Ō2b-kG_׾xl!}Cu4ifُoxY\if#\v-"oƎz]n`\ \Yys|,#y:s{wY35Ʀfhf h'uLrQXW==-Hݽsе<,CRBTó[ Ӝv7**RCW!<>劎ƨ2:ּk94Y<5Yu^"Q D{x<;~;Th!X/O!Ec}JSR\&qsGUNȪs#DE*Qh:xѬEkAޙ1OXem@ii fJc8`OJ=i^:gIhCR [AioeM1jD䒳"ac5V@&mY)dSH! Qc(+٥$ِ"L?QD|S(!3)4\BnxHtv+ **T(:Y-Z CshR+oݤnD)R@WMėyTK*TzC%ǀ:3XQ%@ qc >BAN2MZl{R AEMghdDC0GRF5!&mY;KPA7/uT()%NilB*vNeW'U]%A/k.6v^%d0OAou$$ڗEDjyT"8J(46B je| A$zuH*(_CU+cᛯyk& ԙ:!nnce+&("FIC1"De pJLD0,|v=r;MLJ6!K^J.G[-^U}똂[$;ƍ5MjۍmPԒ"4D(2 !QiIQp¸GƁ>SQH\th"$Qkusox$ ې, x&cQչI,TGW>^_Ÿ3rpۊjLj&8x*@ÆVby\E4Χ,}F_b=^!ExB.`ƟAjy1i"!p9X]%\H("Ft3 u $$B jcJII9QvҹKXu &vnj.|0jF z,npZZ"/x(z2Б%Ysڍl: I,Di9DHT @QFp(yY ZxZ-秝4Sk,87SAj+v'V6B ZxXA¤aG9L0q/ZDXU_Mth*䤵t1>옄CPF&T$ 1DЅKOp%G,X2~Y׵WW4<o"B [!  R GV6ڋ &4@C>O8t~W?8=ynߣjȨ@jEEfsy\4=Q_9(=A@ဋ\}0bgOJz$P.j4@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@LXHɞ\R~Cus9[.=V1 I ( GL1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 rI , z@ևgO $K$bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bIKQ@|H$P_u5C³!`uϞIHY!~&bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &zH/cZ_gޝW͆/εz,?.ؾsHUK0ʃԑx .]zWd9|4è[ pt.SNOzza]?_Wg`JPʡojBT4T "odogk` Ѫy>=Y\߮_l^%zBI8Ȃ"H  uZwooofAٛegwi_glO/uXЕ.X ~cJWP%a2"U?7e Ïoǭ^^_WWbBmajcu(lmo{*ij[}"AW(YEMxi)-j Fz?^mJ׳?WJ.fxe tUZEi 7{og'^-fYȱߔMFCEcX)ii߼}N%AnEַ\13f*\8 /[_Կ.jQ}]#y^=J4|zcΎPbl=^{ ?| #.CJRo,WAll%@$8;Fd(k>Y';G?.|;k 0|1Ot`8s;s {MOt47G$ŏ֦2PE#S/?I*LfzKD3^U&1̂%LBx{2 ~%*J:3.8AᛚzŘ:f[g'GA). 
r;|vZݳwBG`X-B.ʼc=`\0L/M- 8_J1 \.ʅ|݅MxqL |K1ts 9's2fGY>N>-Sć`Jl" u(W߻\fK0Ony>l̶Ϝqp,G* 1 s!.K B3"c\I%c)IEZ"BJL yf={aI^_mT:LL|=8K2*`+,:Q&a6ƒ ,ӳj| Q|;~?q]-7}_\⊇6 ^1·7+z9wM Jڸ;L ,ve>PWl6ܝ"dkAS'{L6/I*tPc-+Vq psY|JEJN>|g9:Ce4ʒ|栆aBU.*(IgH>Z^mXLw|o>?9XYuhpaq.FGt~Elb2+xy;\P!31NIgwE<-mad4uNܘ\\-}5Ȁmv|'sʾszv\UծMn븻]븫O1p6w (77EiH_O,Žmls%#%u9`侦?e<:\VB5UeZpڸ+$XgqsoU40INT %-\ ߎ6T*;-{P)Φ#[зzb0hwM~)25E0(UND/ ޱPmLl)&TĤƊ?R'[;&D&g UFָ :@(BKt4y> #RgQ &l=+[WB'fgrsPOxSPŽ <&?\׭߹pHI|ׁσ@u8fn!"^>>{g5۲УeMcE)fm PRRk󩦆954#s.{VcPYY 7W􈄕C fτSkT XLbZ^ $ ru-ce6$k X8qK OSocM;x4Xs|%̲VGÅ&jg %~/QܱnG-$VښVb,=a@tK30kܧ(C7Cy<w9{R>{s<,GYr&WL~aX3SDNc^;5qixV gN[ly !jr5V"‚tevFx(DX|V痁W9n;C)b`e_ &ó1W䩐%{Bgcv5Ӭ~ Z?j9ɺQ*aP585>t~ q%+>ŝA&JSW}Ǘ_ʁLq]-M-B T^lɸ)׌T<כ ;~_lcq HhpN%40!4*c F_@_ֽ.e~V;GJQ&ׇfUiwi:&PE:NHúr,6"HCZ3:u}:Rj$xBqiN.} pF ['V*8# ULޱW$v>׾j%8Lٷn=`1X8As]v9G~s$Q`ö˫#ހF`-M:^ma⭈sEWEҜ6Bx58{-@,q\?xP؂?18@aS ƟZ{1*5X*2VwvQ7Nwr+\l`~^ 8juU93 T * bqFZMਕT}ªբe݋ֳ] VJ}Zk~k8gQBZd\m9W4 =3`(NNո^|RRL|t+}F+8(t6c.e>N_9.ۂƒ}z] e@Ӳ _8~.8[ 4)z6K)o9@"ɜ%w;+U^x[gI3lާN_/GynM7t[u%h׷x̝lu*͍Z36|*G`tqc[BmHW]B6';AC.o}ENrw5Ō%Ym;T7cu'_FVœW\hW7g:Ov#?hx{swf[q\nVڜm۔PBN ǽݜmr_:ls̈my9~=&|l[0C@c U*P+T\VVǼwQjύt;#]푑HG&y9{֑gǣeBSaЎA[%j섍1ry>rl >}`%iKQܲޚ8qי^\.܊^b+Ύp&`<7M]/v4 iM3;hZ%1_}!*@K3!d즣*2XN+S W!VLU>&F*Eb".ڭS7.;z!v$cOg=3gIu>h 1FT[ $">W yы;V9lro<;˵DH &Cr\߻vη^ϻq|F\߭?v>O^G~gL%j߇:*ũD@YKT@XQN [KA2p\ Qhy' YL2JYSLNxvm-qw:ZB+>dkMO[:a람y='w 7DP6Ӻ8/v@"IgbN:7y0#Z9a)Nng&ǘg%@V#, Is8 ;" Τs.H׳4X[BaN7$Ę]'vu[w σ?3bg#4O<t!k#-d`*q^(IVPOl7DIT? ) lJm!aI(2,&:Dbl[g3b0ڭicQ۴ںCM*o' Nؒ3+<|>8.Hp,+oyv3bjY48d*bbAKFG8,Qk;7|klڨ_*7ҏ""!b4YǴ2[-. xda'@FCPRUkpv[Lfϴa;-FQ݁'^ȑ Ecdq8łD|ik̰r1AVtpg]:W5ƻM{)x. ;E8zD#jT/G+1iaR{Jq!rAZ"Jl3p!k %BC_\mgpoMTg gPIU <7󟧧g]8gg17GG1_bj="Sx(HZ+Rfz|֠WE``ʇiw") S\}3pez{Đ Վ&R/ WW˗IkeARJgՎvpء  8*ru(pU{WEʕ ^\ J*+<*js(pUEpUWW-X~@pE+WE\u0pU5bH1ኰC2 `ઈ١U}+]FRڃ"WE\<*V;Ii•֜l_M2ޛa/K/|] uHϽ05ӏ}w:M0Yg)W4D+Ý!|⤴1e.P;MCjt_{%6HQ!hZaWCEZe)W<;z8! 
kۯ)`XL  e w)q!X`@zC^JtLBY l,Ѳ-Gض&Ζ z7K|}b"I?TցWTkqysڕ& 9bRm2 F r#1f+o8iBIK.'5l8H'osDI]0$uQ̰1 %s*L] 98hAsC3"9IKUf9}/{|}M4c%t< jJғu;mM,heR B8A/b& H*]VZ:/[坶?ҴDBb$!qFh9餼*虒LL`*ǐkAHDzLFZԖj F~I~3rǥ]I!yBx_q"D\IERvDNӴ7\UƉ].d\B۲3g-ŕz>x>$O[dɝ1)Ypđ؂ θd-%M&MGi`w,9 zxJޥRsI ŵ|7~釓yٓEiXp?-{痑)Ώ] KWɼg-~[-`FMJ*Pu<LVwQj1׊2N_ZqVRPO |xݽ%E ZosvWӳW=j}z6[iveTmU37\v cM-Yll -]45#66,i"4E=Xz뢣'm/5/G0ت`[]dSMjزIIs4H}>קW¯+tfXy5դ=?]_~ɇ?|ɿ>|<Oǟ~U/ha8jœEONxkC?ma J՗$_tF%5/L`0OHrҽفE[mEHcks,jMWnhە_+@5 q}g$܍rkmla-lNU{DZplAF߭n4} T2U!J7n֏?$_Q*$${eJ,dP A'4c0W־f9aSJt|9-ahWY$QdFArDxmoGxlKv-jhc%^iNQ=e{9 |D2gI?)xkfe 2hX\S \(hvVBjk@7gi؛}e5LS6zM\Uigcpy@_mxG˼ ǖA&?vUiP+Tl6P et~T/D [ 6<_))JOtb9-}R$00+qWJ w^T,W`;){~z[O&PwV c1ۊrps8!GZ ҰviӴ7גOC7!4Ȍo&u p\n5 cls [Մ4bWXk̾5/Ca!®j׶H g4Y5wytb4~lFj!*тUB|T9( N@r+R&;:91蒞Fymv)TV%5ģj]2j *Cʒe tj@$Tt.39zkl2.TweˑW|s M ?؎ o/$(ZT5E[X*)K%Df |~HRKfdt=9]cvaۥQ=%ޜ,ּzzErBIK`idOhk*9ʪl&Qc8'*E9iSBq0rr#JI܌Q`ޢU ;[p7q6[84=\]Uw׎־>vc>fEL1 *cd20Wd)oiKp\+0hWhԤWn#sяɹИ:ꅲ|*m 0ʥ8 ToL-λ !;8PꓩM!hcK$#[ˉ8RR* `@A~roMCJ7]d}9jgۃW 9͖OOş*iD?$AL(c1D/-*:I8Kŋ%#SFfIKzw<'U鐜klIeVZY v6i¤+L] UE5>)-:B 2L2CHM[U9Dd;Y7që3YrPH,7shXeѨc&xRt*ASU/ AFGiEz ?gU[c >R֑,eQ@^kN#!.EGQrhs) ;6m H4Encα P)UTn8*E}5x1AB/)9i-{%-(`CJQ[9R;""[#1i>Rg[oa2]\%cj } LkPWɻPLJ^%<<CnB밪ukNS5':|~"2̆ !e$Z2!aL6SIR~e ?D(t+˳t[[-Fvvvcz^-?..&GFp~Vl}z*7:M9k1TiݏrR,x nqPS:A,`jhŰ D}5YNFk+P.Z#b|0z2racvKXZ ܱH򼕧- VŨ}a2S3Ō-Mn:W8M~_s%Xp6ON_04!z__.r)租/<37Ď\do#'vC;،`^yk% ` G뼸@c|9ͥ@Qʶ-Nz:!U7x׳~-UG`xꈻ^FLj8SQW/񉓶* zf/-A계_Lް97NSc5%r|=[$ TS9s |=骀9u}wރJ@xYаbᐷ;`s>rc|yigXꂗ,4̙Іfpi8-i@UPUQq2 @]AXG瑎Cp)r.L-ŦkLTmΦeCƽ[ͫפ7`iRm :]]?k{XG'T;B.58'=08)8@Ua a,dT#\m }ɲ} RbW+Ώp`a>+0L014=1#^^Mں4|>; :@ +:!@c4rO9z"xNV*~yrgWb+^̎g:|(u pPCFB$IJZ/ 3'+eKK:yv5.O'+]vA [m Ю.jgY/]_K !q~w E\^RUl$P5AΊ#L@r& tPM P{6hq K1 ɨISE̮F+v03hXgc AܛǮ8MO'32YBy4K@g<7 <0Ew3䈵hJWdb(=_L,1B+D9t>ZJ%RXF\"$ K0p[VqR=c7q6{(e/fξЍpmn#$H[ALOc?0՟oc` 8O*d`.%T!Dnd@+;cTrr-b%24i}C &PĐ|,bx8=6]]xnR:{m?z椈m6IxG³dHT>h#f VWdRY3EA"d2 Egˍm^2%hDTWTbgMp/i c<M?^cgG8z.e,)@Vv`um `js$vA< ,ǜ%5$Ӑ=a]mGU?ZhioiGma;0L(GqTwd5Ƽ=ꐌ}?뛦^x2ťEh '? 
B2'FVzr=vV4)oǧTގFK\w|,GoN>ݜ\]fėmL#oNVptj|OmlxݯGKLŲj&"U"hXfM>=, {lf䴳FFCm`u4[mMr:T3LhR /EJ/Fz ~g9sp>yl<ab~-mWYe+YVFk5h;2*phBJRW[Y[ҷC+ )R !^⮓O87s ~}8iZATZ(VJDɖhk Ckx^ W(|&y.4J!N!Pp‣!fՎseL۾CK`H9 >ܼЯ[ٿKm2%C]'Yv9]e=ǛHyxH(ޤ2xddxʱ"LWIi 6 ъR- B>@O}< s;qgqʸb:JX9yja݌ArQIQoG3R*XTEpUbMB |O[mMhN>#I1xN̉]Ga~\>j'"b0SF$? YeaF f6ϛG.)OM"2cLYNT1rSl`vJ L>\|J0!oIiZ43DoAi>B ցߨЫJ04z{I9@s@}z`<3q/'//N>,-ł3pFrs3KEj/"85ѝ#xxaX0a3 bIp,|2]=sx6Xrգ/5j׻*6u/:b+#y`/g :oBܩ+:=l8? ?|GϏG|D>G?]O8g.4{4 < ~ʡk+k ^Uο ~8|7g|FW&F|Gaaگ {gD٠M5!vBk!) l2./\gV!1v@\xY5wxtM<#uS$,uY7-}k~u1!~VD<1"ʰtfߐJuu4i4MrE%ƽd$QBSWA5VFey4нQOGZMXբ>yzVi|"Kc3IGo hA8 FEf2\M} Uj/.&mK)*mzˊmYUmkqu祳z+ꈴxS<9Fm$x.THPz"P%Z3ĵV< YታhD,NQ&VB<IP&s*FVƆT1bM3㨗dŷ/*^=QZzut_G3]˛m "-J6CZ_٪(Қ\;BfzN@\iIAPSCךt]ҙ`V~n\__Uo=H:"ho șXym y0`q}$٩7}]1tMyŶj?aw)çf{sx~3wrR9TC*pRdr(I"$2ωkw!5N4B再I1^1 ,) Zd&AT*KtĀ"ND h*H\D 43@NplrrT@ dZ +bW c\4ff){uXMni W3ڊnxDsЊ o I>E u&Idq,SV`kѰAeWx+g %rgD%0<B8ZZZ:B=L}B_ mh!!uD}(*m za ^*ij7wğƋƅw&qwH%'1%LjT{%j6"iK"N0؊|ie(́>W{ާol綣 '=jomt] דj37p?KOA4|)Ki5H.( rBB!* S}K)Vz.6+ʘJYES AA9,"/,uKҰr販pS^jhAڜ&*9#Ѹ!0XCHJ)__y1rYS"gV/x&#?CWow*q{.hgvcdSKnԐ@&|DVܽ b,&UFH&d&z*NpLo)=7YCG),"hI\1ΤAHbcRRp=3IǨ.rb /,gP{\}F6&J&x%1$Q5!z5eԢW E\oz'ʙ6,4KkU4,qpxqIJCbP_io'=-~oL7"Bd:j$2-7Db:rZZ0&Y"z+Obvx5Ezዲ:[<Ь][7pMy2:Z| lknvͻ_}vWõZyjL]un3ZT Sxs)]sj޺`BrvTkvU{%Z=/\dE ߀BW&Sk:6Mwwg. k6O;kڦP\ ƞOgr_zl|䩊5\Erfm)"%9%ъE1cV`2!EQR!hHF8"=f]Lc+kjܬhV]ոZTں.yĿѱ2jpgNbB(ȿc``%`I8%c Y@HN >T-W#f}:UE1LZqWh+kDiN#V i&i䶢. $#N\?km;I$m%m%bV^YuD] qf5di8傖X!j̰ظЀ:ۆpf]'5ƻM;_). 
;&'JrN?IrIy6%7ZT1*bj,SQUTiY<_?l)yee]Z4+oM$ztzu}QZ/nNNpy^ $UP 9sJ͊OԴ͚gZ~=P?x{3h>x L\/a/fk.HԆLkwvXAi֑@m:auebpT(\$ w܏ٿO7Q<%Fm{WFN?y$,|~7_-FGr,wN%_g=߿@r_?}xo(go46;{ }wLז8teB׻ޯ7_ßcɥjS|娪32dza9լzMm bhU/װ}/#%!]%_eܟW ш!mqA-prK7H[$8ʨW'쪤''Δ%ƽd$CB0pXш,ӓ0.U=_p㸦IxJFY:/$U&<"(wpe1^Ƚ$No&|)+QzÊ&_W(쾷;G̳TnuYyóVu;cP)baVX+rӱp< ( UHDY:7z#ȱ9M97b畲xY Bdw^Xa)e%,SNyis6ԭt2!0X p`}K{wuQ2ŠhxSc]^+Ÿ]}Tdˇ7wwavViD2B"7Y$3ɗCR )8+}Pg 3Kq& HȑjPIEXfJ¥,D ENtvڱl T||,vn|M^0C-Ѩ1%IUDMpO%U!CF!;(A~frM = ͅZ 6su'^9grh,tP1?o}Yf#A"xRDL_s1jutZZ0&Y":8˲b4J.!2"^!TGIP"H@f`OKO\~p-Yѓcab$/7Ő%íTc)(Qv2bAX|_k!!0c$1H{cv1 ٓ;8):I>Xת&Uݤ}(5n's٤ lҕ yV_a%QL"U` ʕl .(bz5mʿWW+qf*ެ x|vHk҂!-SUe W_iI؎$UN ] (s\ev R}uphu/i gE@'8ё3%$<^@(4$a, 4!aLd_Jǔ39cw+*m60^_{jmZl95})E.X*8'XtZ%NjD.Q"ID9q .5D8Dȴ|l(D'~ ,һ)Tk9|hþ5ZP,AފJ:kT*H\D | jRCŁLY)GEJN<"tUtI%WVEd7xIQѨ%yYb)|PEP )"y2\~|厯3<,9^GfYCΧD !P$$ڿsANbŸF_ܳ8Zp`hۤ na^od!TBgn?er1nՑ(EbRcx֚( B >s@&Wr ;"H)ǪH\QBQ/cLq E pS 'A7B+-@ Lu.Jd Zj&z;؛t%RSG&75m kMi- s !P_Zf8,f1rp&5[_s?'fYr:9ܛ zܿmXm48"qi8HH!EjJ)J,gB3.m.t*<qWSzCL< ٧lU*ٶf`YrñCGτelOϮ8IGZi"E+Ǚ$%LG/G~ʠr@ՌAIRu IZ {OQ4Jlar 3}1}Ln|@b!_=5˯.bΩʒN,rNy&F`ͷdPtDU*TN8S%&St^UnDŽzyF J=>XRxFLSBLx`O]&c{ɼMgAvc}{c4cI",2%z(LR>~8q Q] DTjx8F0B] q^ \s9_hcH0o1A kIX JDeN4hTҠ.5mR$v]DB0*QD&g F8 6Ύ tgAy>,^Mp-j/nWyNb{|X>9ftݜ»w=}ć[V&qj+yvյ@h{Pĥ4OpMїܺ.ҹ[aj> &ܺnZG=7.[w_ydJ=[^w+[ޢI;d4] gT@/z} d% {?_4^&%WUz^> ewZ}I]ѪuƄmC~'d8zc_hNY M8j1!wuOe=_)yNTbտ/ɕBkyޅ=z,(4塓AIj\?gFX$ٴt%&>X<zylv7RstrPs|y}[~ 5E PJ㯴o"{SzZ߻wnԽsbE%1WbyOem9m?vWی8e\fӭΦG歄}`˪~[is:+,k;e0&540}!`y1e/ebWة.Ocpf0pkmFhfB Չ).QI\pH@Dms>Ǥ t.]&6@;F|"Y,He,٤0@HXSJ$g xk#dM=6'ӁT!L*[I+^~ 䘀JZ \1bs@M'6nx>K~?$TuznK6Bd4I3R$IЉ$$g"G#l49a)<4UQ[c! 
YrAГ8D>E3Bs>HFjչqjXX2vB] UNt7f!S\74~8>O (=fm1M%RJBŹ53fJ*S{> &CRؔڠ!RuYX3&ZuF0 s_Puڦ2j;{J$QK•s$Ϝdzi=J,+ҷ<bYi1XŁ22Qf9$ mq"bV!dժs7~ b5meD"vxrd,C$ CNHcD(']ݑvdi@rGLh@3ҒeR$Z:[u'@8]E٧jV#.r^].fqN MT1 ."PJ`4*rkbw/xX:vCU9솇a=m*m:Wqql ;L(5\4Z`bN!=JJj\ `,Qv!ʿ_AHBAH\uK@(fF"QI}FKH܈iRU?rujx?ک{;;}t!cʶ+Սy*xfN p!Z觏VCKJH!mHX%<9&`QXJBHrsAYp>-$38|1)ԋ?M~=_t9~G˚N>v\;]]t.p:HHO&~6 feL?v^HՈ Q:k%MhXA# WB d'=Dv)y91)1)!%a %L3eBTʄ;** J!ۍVm|[_emmSj)9Tc{w_M-y#QLqy$-4U`8&d. Xb9􀌃2+kw9 N_W\<5Iw dc9xցBg FD4oݛs{QA\%?Pb=`A;KH;Y%CyL@66DÒWJ+3G1bO55m3"i)soYBqDT k.馔Ҩق44OV R݁K0!(A Rc`J2sY͙S2˵ 73Jm^FW}So+ {eIr[0 bA7%pf6eiIz_OIUn*azFôՕvͥMVյ7UO2x#M6ԴLKK8@Ya3.8Egώfoώ.~HŕUAC.*r^ 47P&9rr|Ni͛ޥwZ[^vŕ g/.*VueQNm/,~/~na=6ƒ@\Mň86sX^@i4n1ZbǣUCOny9QK]R7fcRe?H)X|?S¯++GƋk8Vj-d/6-.pp'so?>+?_{Wxw/ҬjYB7U{WU@ߣ׊~uk˴ko{U7ŗ)w`[R|8rj b/B5'{Hpz k%ϊ`;5%AR`,2:r/{>JC󮇊ڠ~mDEE%MR qK"9;cq(J+fU9k㙍9@" AtK"uªd :;r,uה<ڧ;~FtYr}bɌ3+uJ;1'k$Y/Wt7$lLvSxƓY?vгz:ܐGm%722l@tT$e͗Qx "@";䩕 `X_0r 89A4oM 4 ArI  bR*:ƘQ_matCXW`L-6mG P](RT[ KD%+ ^돠 tiø&"QNb6!y.#FGhQ] ( F9$ĬEMb!Ke8*&0 ބb Of<9Ή6V\sOɕx屇2XPVi?Ύk$Ey;hkmH/uKr1~\$CR U _Ԕ(lSVOMw=~US]U9g4lv6@Ⱐb}=jI1z~g?p&.W,irȯ,E.ڐ^W;/pu\ adVmbggᨺmXjnN/'G6/r[jO o7I3d[p+)Rh!W{ի :單橵c]uW]}: j7\RB]#AÀ#UM9c?iJ|S|{I~~I?ՎڜqןEosyțFq63hp77ٵZ=~!܀6O9CcWF%϶Od7z.d+K [xC.w6[֍&hB0 # pdWMS|vE^Y'ȏ?gS^"R.R?tq}菛iǽv3I= {Ds98Ջ miQml&Kb?_o$^}d)~?RЎpkԡNZ)=h rW}{9ɪjÝ^ Wh|å_~^Kޟo IS /Tf}S= X^QUs?KT0u9?7T2y+Um<~jֶ❞ec59Gcz?@z[*tJ;s;_Nj vWkRtLqE1PLT)H ԊTރ03biEWPvE$'[0sBFXU ?eX&7i"@aL7QN}o4a:&lf|BՖlc'2efnT)rُYdΏ)bwM\cu>0A95B>tNR7DD:=0MG:HBQ\{v6s}t[f& w}mXnnƁyŽpLOxl9:W[n4v7ll)}_f"JOϷ}*нeN(5#H +dJV>ഡ( " MRS`%//P6E@w՘cDE5\HԀ%V'h ExN)(́>j1޸S=zq' i,p*8\ }-cM`LfArTMy5Pp V7:5iY<W) X4ߑI M2RVC' U>؉@q$ K)([_9W4 mNINy4.jg1)PD}X8mb B=VH_/5Ų'=ڹ<}ט X6j={;{'3YT[b=5$m 2FhO: Fܣ+PLOrvc"T!QDO+ime RLqxb uPQ 9$YgT%`騔!\zϲy&:EN9{lbQ#r;D5BÄZcJ%,PSFA-qUȐQȓ_ wz=LkF@z ǥ*Nm88Or8$X8AbP'=.~;W|?ro"Bd:j$2=7DMOnx鴴`LD|2^/Uf0WF QxBb>HꥧJS. 
{j_yD$㊱ι$cb,rW3{V@H(Љ3b1bAX|^k!!0c$1H'(' 0Vh;J'ecQ ^Navpޥߜ͐E~9dpJ6yd!T닷)~[+@$N]5DkǙ%E$Խ>V ;wWQ(.79:_TyF J=>XRxFLQSBLx`OL4&mMz~i^2t@f^]?fݲb}*枠@=Kz/x/ pQ5`kHDՎQ&1 }ѵr> P51Qʘ5$c,u%I2ќn 4hKEC*Fi&rVPHY"!("Jj[# Cٳn3'WQSīg'W\ۢmݭ]cMi ͜9 ؙSwVxGY3 5"6gcS2}j˪o+f9G2+ڸq5ofܹn[3/;yev~F[_]Q=x=r2Mc`zppnSoɟZC\hP|H3ƞNŴ'\s_N#rm \E*mmnb.{Q4Q'_PkAym9NNtqN:.IGkTt2 GSNR, 0r1e8{is,e{ϛd TkՎFx:E $$Fb<vҋOF/:9[ykW.o!+鶲 ScO0L0u/VA >^чz4ߛ$QJPnkm]mDk@xO0CG­5(52^Ox2tu̢a'.qŀ9=0XbTmA !.3Y`jD"ȵ6WL-Rȱezrz0ca(/u=GRq SD *yH9&,^ځWI_ f x4\YRm5N-┝<'q[=n`G!Qǝ x"Z"I *(I36N-=kM'6DB-ME ?ygLD-6a>Y26b0~Fø+}3GX%]ݕxg 빔~}?w7PxE 4WC|SMcqsktoꘆVlj`QZ']93U]pn~j.'i+=AUfwKP*{2; ɂ\~Ru 2^},gMvubIP{q{'xg`R pp^o漹 {m'䴝U@kY2ԜsW5]\KƦƀX `&(k )jlfqpH|S+͋m4/4$LZE|ſbWl2 䜝&]y憁_q?V,:r^MQR0IY9⹠^8IPܗ`8G׎ ݅%-z_Y8^j^FZz8ziys>iٸ/6_a [-r-zT]{w'kw67K$hUhlhQDɅN`)x:{:{'u2dg#X^f]I:r"N$*IiW ,3)N"d ICGnvwso[@Ggrrj!Cfie7 8G4DR1a*I Pģ rLY½C6{D7iI4|FV[\ )ZcKD.$OJwC(Fn g߱]>6u X\{u:'-v=8ٯ8y6Yqr.YVG[1(G:AzmvqU;K"niOp8/kyYw^?"/Om0ZATZ(VeGo@7PSTz#R8o*jHL(,R…XM;E!2aTn-Mh"gOoM#!WrjcöwɸDY}Q KC6cT+*@l7.)z+!n $_!sh+_~p9Eq͊/33%#Cdh 6 |nIJD(9ě%@/(\yl*@ؾ\}#rD?~Ҩ \*nŸgTڱ D"rH? W'n(Ӵ YYbV3b^´RRKgNTɤY+If#; RYAKR-"Pcc!D8)G'2M~M%8(eB٤TF׏b5r _HCpP?rAhC L0r$MUMFOLqQTI]' AYc4޵u+"l^7}NMz6pW[,)pb%g/"Mrq w7&]Wâo~3'3cQ74*Q&~5̊;7@qۛ_}__{x(g7i9G4{~}#p6v][Y=T[m痫wIڮ)kW|>Q3}v:a'8rYXgo&&Ra|s1إ_^v\V!1À@eG{uqFiM|{=+hU҈fR^ 2M!bJkhDNIOpX(^=!hM_=\G$<%,DF g*PA; FE48f2|cNsI^)40_k(&kAuf. 
oDlӝY*7zsjCw'tK7>\u}tH.wd۫!/#-xxcMtw۝ǖjlKylJ[HLJH"a4 ۤ}*godŴX*gH¯NzMs~<1j{@@H8JŤ""@@sDhАHoCk@ P q.@.T&Hg6Zks 3%7vm-t@7õiW9*7Np/{?%LT~o-y&8-}vPy4م{=_)?8B!e&W*V9B`I,6bx{ԑ" "VIڠK5I"$2ωkPw!Sa/!$MǛkU ǧEy7:|')9@P41$8F)ĥOL$АYdA3CMPg fU s,PKʊ g.VܯoNo-'ȯ=6wkɫyDXZ;P 3x_^xDsЊ o I>E u&Ieh8P)+6hؠ R2*2g8^V4V4#LhU&1)ʼ;W%uYA j"'uYo6;5:z\^9Up)w@4Qc:p+ \Pk!خb)]|׋uZ4]G жA}(*mb<_0Bx x^nXPM]p5D7r5}z4LC p_ wp;rƮ?R[g':~)C?wuv~oje s![|e(mN;6EO_#NJMzӈHdQKYEsa5rV'd׽a|"Q,ò$o4q6\bFu{; xMQTG)4v5gireH:ՠw_1@ܸ] 0R{{Xm0z=B6fWps\ohya1v[0Mdo7tp;b)Yq%A𝴾39>9ݼ=랞2LNd.H" ~$8m%{&sqP1 Bx'_^LR_9lcDE5\HԀ%T'h Ex>UzE^@Ϋ+&[x- ^Wq[R}ן~8wi}Q& p< ( UHDhp%u+Yu9W&xEs^)h!|x*B8u؀Fb5+7YCUX%r8PH5(Ǥ",3LG {gQi)]^xΎq 9@2cָD5BÄD9Ɣ$"=;eԢ߯ 2 .KKAO?D9ڀ&bqiSWu'^9grh,P!?mo}gQ#A"xRDLd]Lf c>̜IF+8hL4X`a Mh*FCA ˣ$hPq#X/DS)-#c s, ɡ,b"g />#KRDr|FN$_3  #!FhGc&Cc) ykt^Mmt!RUQ/ d $r" 6)l"w 6'h3D:c0D4 wI✻ !U9Ѧjܑ 7dj3K%f,vm[~|d&W7imBZrH2 ;e[|TWAH;y^uA KH!UpJL09u^+XG<1`-JFGυV()5-&3B5x`OZe]{09GI@n}}}=˫Cwi[v;-{&ljop]^t-gpQ`+HDUQ[c$}3rSc)OU 1`LD2&(aM4KAIL4>DvA% zRѳbQ4NScp@ ؕ3 RHF%'s(nb1{L7tѰT|3݌`xk d/k;4lƳVm=̮6.{Ѵn ơ誑@h;Ma2L`k˵Y\}5sM|WjnpEvn h!5\CƷ4lVln]уחRƫY' }|W AJS!&N &2#@$\k1_qt2u3_jD-(n习tEQaT6G 9F;!󨥈7v!)nA28%~Mr|iۅ]mk< # zo7I\oP 8L Uj]Qm$"*NYk5we%X0($2r8a@LxϴBz+.Q6eijzvg ŮY9־y&FSR˕_v+dL??{W֭/wceXtt[XܶgHBٵ&鯿ã[e2+ 9:!gCK `O!~6?H.d:Jqɥ'<8OF챼HAxbLzK.`DIE0Y'EԐS(/KI7EiTHR>I}|e+ZXme 柎ZvCɋ5 upEIkFE)K 6I6 lBרkr1c-Q!)\Wׂtκ.٧_u#bu72p'⮁ͥi]t5(&d9^XK|؟ՉpS|diEZHY) utd?$9h%W]%"zWR*VGާbӓ&D1'_ QJjmun4$c_[hB3Go9_\1!ÌA7.N&&Ol[┕!3 XLA|EPշVJl>(a~+K(gr[{Xu ̐Y$0&k];@Ic_DL$AL=l֝aOWx0El&ZD""VfcvE^PP %,;@IےAaImo`(ـɁ* +uD̤8fCo4Uwv=S2C.K( ]Q ]g(2]J< H8_N2f,keT@&{T#jhzzzO.y޽O &A bmIYKlHHB H {M y$kI~IPj5MA 1&ˆ%pE.zd2IvADjU۬;;F֬DѬpמ%g;ۛج99}}eZ6_~NQ Y̔dxN; :7 e*AG !AG]&Cx)Ư:bڲqY\[fA¦\_j IX#rdEVs^j< Dʇ49(e⩕u7\.:Kfǟ"e :*tɉ$UT! 
$_Y jiEb*[蕈ECKdP9-#z*[ղbܫ*Ey:w}Ɣr۸&ސ,GGէ=;=(Ji#7 LG]7Xܥ?.x8.T/9^6iyhq{=6?W~/|w1?YUpvCg/v>μ2>Yc{eEmż/$N׃;oAm-%07v[3jk3A\_YS6Gb]O]9~;rzmn{W&֭ΪH^HmذpzoX_~Q^8Grtp(ȎՇ:C-#8|QW_}WRW_W3p ^= xX:Q-Mtk=ᅴ3VZHZ4L T2M eP`<$V£0JҮn[zK7!1~@`_own#iM܋'A&bx Um S"!$<,dE" !YNTMt"z܊{B(v7A$_q8ӋI :4zw9WWwlShmwۦ~17OLOM4!E;R| e՝9Ww!kp+Pl1Qwunuu) @*'hjgr<>Wauq0&SӢxlDFT(/ZaPPWr9hO^*蒅ڹ\\Y1O]A5R c g mu[CǿWŻK ۋMu3.9Uf  ݼ牠OK7E:U`YQ]2 r'aPVLe.z DVUCЃt% R5>bV,}DLD S }i{"@Nւ L\(eT*GI01#f#ug?@ܻʜe'>~eezRn罥ŋ^8{n>]ZB$Y}ѭS&zt&0Hf7JGIj*!Ȋ$4!qn[4=`s҆lh>CP0Kd3|Q:!#+U!:tܭ7/ls)[_l.{W69"=v^10'bza3K^}S]St"sCtdpՁXR Dg"gH'򱛣<,Fq(jϾ|D0y|RHE'KNh#ߨ5Ld8$b„UJunvБLz$[u=!RNe`BNK"6?نӴj˸,Cnn|=&=rK巺(+Hf!!4tJhRr$9LH59fISnfdig:!]0)*d╯օN ^+KB_;cFzE)l9K& gЖHYsvZZĠYg]pC׮dKaGFᙎ)gEux!eS%E`+y~]AZ_/ v埊s,`I%eQ С2jh>9$OzZvA?j X!+S$*[#FOkh6z!8ZqPAd "vDcc0$XuN: #jd`KYSdI:/iԾ[Mv<ѳCQf"ʷF%LAJrmp ?V3`U"ǂeBR!XD`hk0RfvhV;脳[}1 +|KҭJU~2z QOV 9@Wj0~{썲K՟!Ȅq &h^C!OƝy2n!m5ec29f\]lDH)gf8O@ZEX`NL`3PĈ"UwS%KdwW{Lk~\('⬄4$#+-I9j }/5<\xp_޵ԧ ʇJ+"8)!B! c h߇%Y)gr\ΏM 4)v ӳah,I޻O%ʩ Å*` ,Jz-\ 2Ds}] 0zD "dY>4l× ¢B. 
ç9==F *%:hb[@Ip\'"s1T:"PK:1@C6_fIgxmQ娀|^ g;0]p=܍7kl#]l<׏6s-03D(\(b[COA$i,e lM9*6A*Y{)<4гzzyn#:ǪM(bRyo4jI^EX6_?f*G"ȓ,'7z-xO=Nf*8;C ( EBki9d!ج%z⓾שŚ9+(A26O"E5>@l' _*yw?OY\x׻sl]+θ]S~~ !܏X4W.㋋a^RЎM kꤕڃ<~/Ab+g5gEYs矿y3h~EFvt04yE NNx1NY遼CfNl8ٱTGo=ګ8#p?W|@\_II2{xK^pH&5 ,gḑt@9b'Rf0хA{E8) Wx#̒_ sBA>65|WL{a֝^בRuL2w$:ʎ~u=vC'M ;QeZ+l(RW<\o]T)rװ)WV|,┾:󷬮ԅUȶM^n{1xbfڸΉ{VUUNQ@>Á2?')U̫J]/\ۃn,oR]\$sOL_P4jl0?qkRpmFc:"GraDz{l<6ӥlG5pqa@5k]a~GaKg&6y0oDϛ.>7\hi,.tɲS֐bJȃ(a;mC uOnr~r³K~.sDᄎkGVȔV>ഡ( P%ȹMRST??0$| mp.Ŕ#*6$"\HD@U Cr[w'Ae >[ftݭ»w=.>ז\ē[n r4ytf"=("aa2} ˭uݢ >[[jﭞk^r<& -n~G is|Kqj]oz)_6ݵvz]cZPO ƞNv!k[?nTk^\7o*!BǗͻ> (keMu ZP^,qMII)—tqJIE'p4Eb8'%2di0qp!RI-972V; Tx㩎F&--Xi#1H +ņK|$~+s]]=+[8My`~=M-ASa3MN`ڍUGUqo{Iw 4Lh4f^p-Pml$򴕊}[WGt_gSQ{I}'e$r/הQAMH;1@Pz{Yx$$I*ޓ\TYs"Ȭ!`dYa( EZ+Q#^'|Is,he&\kԜN)}鴰}ƓY i)DPܷH`傡>=ImEi$'\(͸HJJ ܪKA')bq9"*#9ǐ:$NBOUcP'1f2E% A42vd,Uaa1ɸ/X(z,{(Efkr 'w +.emPha4|xF줘&%S.'!D<(s*'G JE٭>feFs \*/IvDPY#8LKouvvqXQ1I}Q[Fm٣v`7I|TQ~!.r/\*/T<ЩE-i,⺛%uWW!.QX9Aqp/ߵ\- G>Fp*[j2%d~c_ߺ1R Gt4;~g6_Ut:E5p?X)`@ռeFoC F֒82VեZGoZ0٬T9¶72߯82qr1j{eVu-n]͛W;([Vmoɀ7zz<ԾO{BT7qV͏R1''I|h~B_MQ pz~;Es17\͛6r|(JFָXtKYjky\^٘b^ ZЋ: h!=Z.!IBf0چHYvߪdՖeS n lD#5g;VGSfJk=eZ'TGNnTcT)G!qάʢn/:]p c{I\LFc2cÛ1NƣodL+TRmk_S i=ÄG<ÙW^_/`9TX] M("f_[b =N x@6DAۦݨ%v` bQV]Q@R@"deP{':B&oC@E L3yǩ`: )Nm7 hs^Ou@XK4im4jӡhA-@Gh⬭K;n,L$ frP%JI>@2)CebAEm!cQy7 s.CpYۢ!ˉTICXsB LB:iϢ;K4YTJ Ѐ,&y(mJUΪrneVGae5+II6 "Q}A׶?t,2 cƐ%`&kcx[}i5ʧ]v5[&V-ۆ B][tqtU  G4T4 0ځf1n I{vHB@qڑ&5Xh4v Ƹ7==>#|IeA][AIxS!9njlz(A3ds5h+'r).2TP(yDA0J$5@t-X ~a֛ۧQ8\%T~Gmܡ7\}OgaNJZͲvh`fUkrY}3`{o߱/{zj7+`J[k}6ңc+W0 aSWVt V:CR[KW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pY+is`\\^x jn$w\W\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEgYp9|+kW/;XW#kW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEW\pEgL ҜpeL_uh>3+1go_mKYe7w%p_u)#vИ4+~cd|~\ϮojxߔyX#7|pwj ;B)76W_MWT:nu^^]xlFy=vl6}us_%6i_lii˫d=]Q4hs)OYkhe|b|"hVs ӚVR#k^u@s{.viU=21-;Gz [AV6 %z6% Ŷ}Gd?s:ێ+8RΤjcf0[f7012iUNpd{OoVWc|?%?~2͞7pcvF9`ΗpoqD/V8~wq3o.>XoǎS!S;6SYc;_alD iJqܶt5ˬxg 
t\*7-wVּiOo_}fΡNea6KyZ4[GVwl>[aߦr줲}ϔB4/?"vO@_6:}pG~=6?߲ю~_]Ě^[1v{^av=߬?OPvx'~h/m.ͯaoiq|g9wH F {Lkb&AFlwwIoY .ݿqo9ԛ3ӐzYz\%ʞԟ"\M.2eG+8ShbmIN:UlQ*Ip?,lfjr|'Oop}'0-Ov4WZκ+B /)ۊSvJ![*7󠹫 zeu+o?ŇS'r8pwH6w;s#׋_,iu+,s~ŷ/tw?s{Z?{F/ lټCo{}` 5eǒ$~KKJRV-K"<OKs8yn[Aۉ3eS [7[=aYjW(;^ϵCa̭͝T-\m۬ Y[#Z MPl2K؞'pr$]IW'Aҭ&U9adJ-<{J DB"h+jMz:Onoyj,\qv r{jN\0L0uVALW渳:Kհu_ru,סwpju&z5ч,:_U}kiZpYW/N+"8k(vl=y&~F:^ߐO,WJ;nGȏSc4h`A:to+HKR4AcL{*H.-=yb4b"'"xbz*Sr"$6Dm>#%$nDfEGQVEGsypJy.$ɿI\i>_ADk6VVFat4PѲD}3M2*òi<,JKIH*4pFwB՝b>k}Hv̞c,uxK]* ROf¤ G4HŅP'NJ^NlB`8|?=;y>'dςȭOV/ jq AI [R*YV* 6(a["@|9 6֝4tOCO 5iJJ~5|vJbٻ>zL Ql.x\⧋ )1֠i}!Xc "g"?(<(L16Uׂی>p )e$Z.Lmgn@B[e#x e͂BDlR $S$)% Z#Hd E^7Tf`Nj'S0l%V 䘀 !ZP1,՝ 5m9~~+gg,*O"'kZئ2-;s}6}ub(riqԉpUDXA*$NL4Pt"IobN:9auAsTՌښchژ%z$wGȧhs_sgRR5c֌J5]XM2Յ.T.<.\.0+ȸ ''8:< ?]ߏOl5v6B#Q:Ǭp&3cS)Brn[ׇGZIc*dGdB66!hF8q;f]D錉ilZ٭8PSդPm*kmknTI~ѱ$j^RDc@/G eEgXLuaVx Vq LhŒ-)4zzÇUdH:+j٭[Rp+T4b58T#׈[6$% Ff'L 9eZF>D( BIpn#ȴI䎐рEZ|K{"`arR vZ#"TҋE%SuV"bI|`e݃Hۺd v PI`4*rkbzzTa58TpU*bo* ~|&Gl<^v=d~<9;4E2aNIxҩN#?joN>FJng(J"ϲEӒ%n}*(85 IY>5w`7$Ž%-?v/20oD7]Wz}Pn>>az@"*[mVEƍJ,t88LIdYdq>lN'ǹvΓڐmr fscPE>C֝+x#BnÃov}uܷ͜b0yE,Y:,믬Uh6Թ<( 5 M<$*mI~Y99 ]VR{:5n( s|$o (L ˆ׃q1K4\ nSN4<^bF/a2>75,>im._lg7uO2x= MspR>p-ba3.8agoKrtzc?Bu~>n|HD#u"_8^ݬl*u5G͍ypJ`Wfۚ67^ ղ3WpG÷alW $$jm~QjΖ@mn骫ٌUayE 1uh(z<CëtÑlU.ծg54llb/׷Xn}7!|zv+75vM5g7gpxsA?/?~}ϯpa߼o^:&ᬫ V|uu_сϚw4֦[5ĥogU׃>4N)}\Trʉ (:dq9UKk b@s9s1ا]Y;qRتK0 s9:v\#/Ip骾ċ%.HBRtbġ1:Bŵw^'erԏySʥiWϧS4:x4iyV(Iy\ -@yL"\lzDŽNsIݜLu-8(<ƼUè;(㡧;'̳X\y EU]@@D(^ߺ JɣAefZm hU#<ӍV6Yha~,ܷ.y*yDѫ9RIR#,ZJz1CIS$Ez͡nmzc-tJU3t|LH2otk&4ޕ=a\-~ʡf ֢6o=y&#]r^ oӻe㵋gEj4fMGФ+GM("O.ؚA'y\sӱ|KcPj#)*Zt@WIf$ qBFyv1%*GInJ}~1 fwqvǛav!NZλϷM~+jxבa!("h*hHȵtL,3?{Ƒ~,6#A8 N6]'{f 62%Eۺ}=3(J#Qx8L`iTwW}*N]wǒ2 /wlO {6[c9ȲHITiGL4Ċd'<7?dE*D<'eidlydڨ5/1: wjOH1p. *a? 
_ov;Fn"n=AMg^J/M *4]T8VϞUJ0M-0ٵBhmL ^NgbϓOߕ-,=wߎ/#OnVXE7߅&y`erJBl/F/0u;~n.YzNqg$+Ly:+K,듷IB_yI5A(QĨ}ˀu.??;^Mg2oޗInzxx63 ?UʎF<93T[Mupq-j˵D0kf`IY"իll/lvu:zQgI6)h:*}=Xc¦7HnTʝW.Fe '_I_5l=DVw!җs[V~ȶML-nn{-Di<,ޟ>8b8FJ*zsHIBtZzu]'ey}/i߾˯v"]Tsxt (!= (_:Kmny^5l߷Ur2:75 F}P0p?E:XL} |"pǍNITRfxTD8/T ,CX6ʼn5e@h;Bi6:' z UL;Az>̿A̿阮 ^7ogӭOARhghqg' 0 +;OތP\z]Se^*QltZ涨!v݀z#I%@Z~7|iS!A(Y.{A:3qP\ k{i2Z ASV dL|LMr5e]FFZ!uo~u`\e9 ̓Dç୍1AlTSs^l&&Shwh{q~.,6SU7T}x&9e!J:$7(n+odacvoJWx;/u!D¾=/\ {?l> ì];o(ƮWKi,-KUfmi36Oݻ[[m`>Χ0} W Tqvɫz9kUÛ\_dDYFfAZ(kG aK$PouW]omK5 xvrcmr%h2S'q^Ab{5¶j/ k9:9G wqIc^i\NDDH񚝚\`?.5\tkG *eAvjUͥ|IqbA }m}^~Y;omIF6j#˟WjvlTHe7 :/*8jCҎJ/g+Ҷ%lCZ'jJJv3t88+Eck]bn$;M~\&X i3ݳ{uYӳR=;[Oj#9iz}60n/ӨLmkB1†rhU^nw*wucka V6h/5v6i|kJ;?6c^9UgVa['$WKߡDB:gq'hr&u60_irJWgG2'ȸu""[{md>u2_e-5s]Fe[^եv~{5}vGD;-e*5ɿ0U5eVn|=T`*q0ǸW`Q !>;/Xr/Q*>Zmv%5n"iA]h+!cn~m-G6tu&$ %F9pvK {']ӫ-I^O׫}ZKAkD 6]Dp1b/vpnqo(}1M#]M#pK76aΖo!bA^݌iŭRBp껶BJg_0>y:Ɇ3ˍ1m7?(ɷ]"VW츶,]{jn -vqV}@t-hEr 0-=r@xFy25JyE%7H&DŽ&8Ι49QextFV(N|+8u\iOWv;z mQۆE߸>i?uu$+rS,;4hE2k]27['9@xܑ\PwF*\z^[$^Ji).pN\$\R;# 0­%WxGX;j&J1HI͓tIoVXC 3B INɥeoM)ba2 #Z)ϑJ9^j1[}(5.x)g)C+%hY $S=,)2Q!Z$X]L;]$D%!X̘fؘYe17sA;GZk `fa/R2 ?B}cW 5$qph9szT@ew1>$,4Jrs6:«V`1)0P#Hciwn6\HW\P@Y#AQ \;8~kÀ'dwI);n@TI%nt9JM2drpf#Hk!Qk`̘]gUwu gZgPs$;,HQ[J!H,*Ջ Q;CPRb),F2J q ^+pQ.K``YJhfP! 
ڑ]p %z-& ʈЗwaӪ={F erHb-#e,:`[V*Rmx8deKT8SRyTa pL4d1:6d+nU}BuP4(V@ڬQ kmƲlgl G 0 6ɗ$϶m#"}KO*Yn;ȫ#.{!a%gH!Zی,0H8u X Ұ>"TB`uS s(E6ZmRAN>' (MNh0BE}Κ!-'-ju,:I&3USBYsTeJ8eЄ\zAZ\!w˥rXD/K@B]NvhO P V40[mA J <(Z@mQUJGbU>"xk 9[`&f]D`P*zETY]Mp`Rp 3J` X[@f$]- pl+Yɩw ΠDC4GRF5`Ԡ=AvRP6ZRFBjjR1.-eeA)bq@!i,g5qZ/;֮{1/Yb吜$0_e!** a:q"0 Ӣfغ3A.kQ]~ר6Ez|eB$(r`bVfsb=xn3J;a:2UIoSEߦbD@JW@yld"'~cR m@r PI"DL䴤yICrLUc4l=zD`|=%jX;7dd|X:3(m=4G+b3A׀MxY N; opƷۧf)1,dkcCX'X.z+2VUf3j1byN PH]L.M63hI^qY]( @?xZB 7-jHp9 X0aa^g(_~+ )d%Z3- 0 =xf, yѠ6*%U[oeĽ",ZӍ7(̀Ia $(5@=Ee(!etqH*p+/h .smnY|*-ZWM/"c`aVMV9JE`${]ŒC't\&!K#֘:se۪"!w* "ږ1g,ߎdS&<uQ47u)`3T >,`h"(~d|znBwY_~ΖO,KJim ѯk,OovG -;xvܪaϋxS>oe_IԻj=5|bswG+Xϼ$U//Gpsx1 VJ\)WN)gHpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" UpoK\9蚠!W0W#={t$3 *W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\1W#I$+bWܖUV [oe#՟Hp`% HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$"գ ${I+Lpsz1++X)Z\Wh\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpE+\W$" HpEW7 ƙ[ F9:q\+zk;C0,u 7@ci7@ڴ-Fnu9U$t6G30yŤ(T 8:G^YWzqMt<<'+:8kq2Z#`ezc8_7' y1!<107f6躥S7T3^ #銰6sI'WN_y쥷 !]["~hnu>(\pUrRr\he\ZnF7Z;i)F iik05_ލ..9*^FeTu_c궫QF_4Y+\k48· cz]2"!d Q&Z+]-} ? _v둭ˣրĀx6༥Y(e2Y@_=;Kf0}5ӿO&.5c3)@WW0{u/o޶>wn_6o]hʌ3lBF7礵2Vr|cD#7x# "A@@:=ʟࠣg] f1R%*횷] F7ŝ!=mdU ?UiTd :8$D0^e~J6 ϐlk7Å/$UM!7jc_knINnZy@Rt@? 0Ĭ 8+J9fp ?ŗ]GcN<>y'N_blLFUi) X]ǯ6{;?]J _nfz廿n;dY6:~mUdN9uq9o\͊eWX6L o ^s[x!bdq4YfS7JףҾǴNX U}Aw{XWg_>8aqziz16"l6M^l,[Mrpc7ܫص0?Am.A!W{AW} ^nJkf_+)>֗lTrT_^tۺvMuǸZI\LӇ2L*=m7GCɕw+ :r\nѕA9i~?($kcY plQN'G/G}oAmZ3VGks.pU 2F: \)Ӎ)A 3vp8]}VHm"6V X}?5w0N\EE$tbrfdB1Wf1 LG*L]f|9&L%z#]ek:p/?(<b͐Fo+ƌ}vLt"#~REeKwx}sc;~Gėvms?*7PUF;Uj&LJydz .1K&*xsȒzZdڞ磱_ծci߮6tf|yvs\8[Hg!u/A4pGs\WSTTszutt.A[{~16?ߍ'ÿͪ+ ! G+Y_ՕwFB/5+y lxſ_UcfC[wy-OKwͶ\mlw@'ǃz-yPPow]ǃ! 
}7F'}^yC׸Oþ5E{[ekv!?Ma%[.v;3-\e]dI\*#G<%w c;9Ga4Oo "B=bL^wg]lz2U֯u|NOj f@O//N}ʿ)]Tlmc-ύkƣv} %v>Ȋt3be,G56f3 Td(.3PWpV/e)*i򯰯vI:7]1ZYoKطp4sU0r~ ^})}駋ɨoo~ZW/>'}]U7hfzvoqVMM)ZGo5ztF5m6LW _[׻RYe!npuzuzѝ-7]_lfwC\f魇v_޸op+t2sշd6m>nw+sy[n]N|Vץ\sեo~[=usS/s!]X\Y^P+W\8_?jqՕ=Y-ڝ.s‚][,y"2AwpZZ~@Ֆt>6㿻CwSg˂1Z /M]Xe,_eGZ|YP=et?-/Y[LmсCL#je&^֗=}"w_||!^rϾMW?n7VRKO#8=mmV6'gCܔ-TOme*N#ki SҗV2)mwGgZ8_Aev-UVf1Ƴv2\ơ8' "H"(fwuufe㉋Ks ]wڑ<1tw 狵]xֺ;:rw-NM,Cȅ C5?סJ#mpx@v<o{bTb]W5Ċ[gt &DFnљl.VێƀF41a%j UgEj`%2IԼT5/SX{wM,pB]߀"E ~moFWyW9 7E/1 birsb^8[ûY ƥ\Rj qu MgyD&<@_=ԹXNiPk $S'Uue +NGcbTĨ-&`(Y$6٧UI )LI_Lb$ctPE-",B`BeObG@)BbLi4056S>.Dk8+jy{?,Tl/nZ:9jd'OV5Tf2XdO+{Rہ], U6\@Yi( Q `t[ː 6d_`Ne՜+jUͬ"h1=%PQNNKRl` ,FCm ~t$c_[h;B3rtg ob>cZ"7I}' q6;0~lVLXv\)aL: *cmPXU2٨VG{*Ŭ2klu[Np;CB)ؚ [syB1jw}lhG{7*b:ʙBςf9E1!EU["_T.}a5 C-r񹅿 -KHc_DdTcg{m8+ XnE--hdn=WQyMTvkhpYmJ1inqmN;*Hq1fG6;â|K {S& Jy^١9QWbhMJbhxVM)$z$S03&mCNLH L0+G3h.=&Cgu,$U񪻂G~: 6˚n~5yz0ZFߣsͯȺbCvn,rikuY8mN٢;`mQryH2)xD#JE(MG~Vx\}\ T\mDdvPʊ걕u7'\n؂O1uuNSEHNgkQEi4H1.VQV"R;g`bd/Ze'E,S7*R_E'(ƛBS_: $h52l >4jb=(FcXƍn1oϿx5BϦv1h͛yޖrZĒL~8w YU=,&ZO] GV>*n'ã#PѯƦCUl}Ox!)4(@_oEL/bQRec߽U%HVM! y*K+`\RΗۇpGiz߾iRT6|fvvsɓ8+w$ ,Vիˮ'ЀLzv:Knyq,8w)(3Y+ZFK(rN=ٓͽ&Wn:[Cޚ`ߧ:=Z-˹ Qt\%~}ɝImw ~~n&6WgQ\+:=>^L'fr]۞ɴ]_.t؈4K-~I09^|bEVdMÓ4=9q/{w_d$Y^l_< [럶;obYӴ1䧳߿Lm;}\)_7)7y/DF.~E=_bۄ5!2 F\ v/|-KHʄxN;epjɑItφ'AW35)Uc2Ɠ-J8sPM2`c%0p0.J{8DÁdC$P" lC%EG.{E@9{ ^W6ǒ}.:<;9D^2W{&joW}wwZ)`C$E BEYIOCV9Q`Ȟ?6r P &L63%?ʠs̎ϓGe>Yym5 ?x1\k䕝o{w~~7[a9ux6u_t2zCޝK Ri R]L^NOMCɼhb")R v@Q6)0/PW6 9Tu.`- +R  =A !i-lpZ~ꛠi)mpt ̓S72GO.4Qf ],+$t"8BG><@ d8OD@7~'=RjMX́0Cᡘ+E# F;뱃wFWJ:-QU,`Q9u4lL5k}ھ=krXJKiVKNN>ħz)еOygˣ9;R/ :Pn6ћJnZ0ՐSUYabҷx `*{`Cb6/c\h_K6bI.m8k8?=>^m!ecOVUżSdn#|$fIA}#srsrہNF"WQ +;q0S+'%FB_[{(6\ C҈^WA[0jdʴꈘ"\9F:Xӱd*8PbB*(ŘJ(!*@Ea\^ܷBa/*\CeLrI5;b@+@E!9}2u7w*X0D,ŴGd@rAm#zWyTc.e2eTPëT˱B#(a$༫ňDlE![L!'f:'j{Fc{8]BUݏM@-A除 ]rU;3cs+9Yr[r/כ'⁍!נLtK䌮O ͽ2P\o{pq& \tPKJd uƶ.Dbfa6Z[˱K;G% Ak/%p Z;](86C Jn ~l!!݅25nk?xIe=S^-[ZoEWlY0] iDd$N*9-@ F jFw9 ~d90P"cMMm;ж17A>%j&BQ1UmL[5Ko2NOylW@\$&l$GV+2,s=Q-PՀ

4l "LSD|G@&T蠉oE%q̙P j*H\D | jR5(ڪM(bRyo4jI^EX6_ "ohpyZA,[?|r㌯3'9앳Z+rgD%0<CB8jZZ:Bݪom`XSwz]Xfw?0n;X1ْH tYev;_)9fT OS]<;NUEv5_4^A. 8_MA(>-RЎepKʤڃLZ؀f ,[?~A)ø7zSSȆvhsMN {@_Z0zݻ+9eyhzͳ~x1Vq3+wyk+N`2T5 \ǢHA^OGgf{3s׸8sOqpY%3'd2mz6km^! %ܐ0fPvkke s&[|yA%dL;+lRѕ"'/םVJȦ\]iDIdQKy-sJ]XlꄘU3uNഡ( P%(s S|\ vcD:Dd Q X§8`9ϧVv e<8l^n_rZ[ q6Rl=qbKI }1 uѫQ1Up?m1$\ xW bT=bP %xl2x Yɴ1>/X{ 5w}@ c|j@ܨlQoҫQ}isfo^dP-ns7I5mD;(zY)X}R$ތϵ4<9.EQ^:rs-OpS;W&Y (ךY +Ǽt>8a9z3M/c,vq-x>?LMeǛ]&hɐ0ےyAK2JT(66/5J!`9C=ii?wAı,j < ^x.%TF(j 2s KIB81 6ABPxF|9?SRs-[#gyЇt@v9.h-D<L@1.֨9RDjk0$ߙ*'(\Tڬd_O"8Oj6d~eau6߂AgkYaKuKtk|Vo?q_ 9"EnMPUoXVeWLm\>DsݼRBrhK8e=kcP'1faJn)`T9ۑ=Y5,l3B2 "Kk\Q$˩ Pp~8>qNIZr8PB:xn}Jă҈J$8jU^P(nW9cVl`;8˰ɥAP#$2|D2n-r#UaE<n;Em2j; vccT!.r/pdoT s>8oCN-M6::$iA5d\ePBvQCR{k1 XģRm;vklZ_܄x0}QC7jniU7Q0\ g8`.m1 uG"npjP<(`)ŧuIyޣ&@I3eDlXqqqHָdO\e\4.v [dQr0pp%$F hqǾx(Cg-`\(ƍv ~lv~D!d ĩU,dق#$q#<K9+M#9BJO:BByhƜ):JD!1.(M"UQPk, ˟sQR^߬u!s1c%&ڠ9y^&$rʕRBx)u Fj spaB¶t?yK3ƇGR۬t&NweQٺXM'׿H\ zFP.I Pu\Rq^,ԇpR!kq{Z? ΙڏX0R,y 'qJP}i 8 ^xÃyiU&%y.4 {HCp V[8QȄQ9PT6(W,FΎMF]^z劥3S2@y9E]}z?ss!HfZh/pY6%T\`尔\%$mP<-Om9$q+Wx,"=Svee JJSV*\BZAf(dܓX&!V=N>bݍAk4NQ7#)v,x" mR(;(clvOmM;"EI &j5#6qG BV9F3ZqUɤY\WFͼG)[H.}b@fN1Q$@H3RpDp"C/:%I4 Ä-F_Ӎnpbo/KKClrCh.ИboW`YQg-&1ID$)HsO+0Fe/R65,JjKFy]qo&6sB8vi T22QB9acr|$c;s7<ϏДŧ$Z@J8jQy@(OwkKr%khu\:2*rc0Ҝ1Y,ƶ(>]|W#Qdv>~ -ȇB~Z嵌7^ $UQ& +NIJjŵw.ټ_k3OMɷuWgĀbI̹cP[nv@V5gZ0VOu='wtnVu9kQQ8yN>^986^%]"B4by}3/_Ȗ' FШDilP5ŅOϞ"|Uou2o]_8qc8^Gf& Ay_km#YS`TCn6lMP}$,ۋ%s$nI4 3]UUuW޺Fෳ*e)&"mO6_vL9j A d{u]/Z/H wυ\`ʚqoWj~%}lM% qTj':Kmu4U~nx"b gT\d!RtbC1ctIP'29Uz?ҴGN&-*pF0 ! v64`e2(IpF+24HBSe_8VzP9tMt]yv>uqRw8j};';"C%:H~,+I7kbp1Z '킶 mt̵6!1d8H\GKPbyw.jfB::!TN\;$fL9r~az>Ai0gi2XR^EM/s7W_owŤ!4_ E?R0ӅMc`b\%8-(c]U-g)!#A**{/j3y=--{! 
㉪QĄBpƣ <@} ?(}{ﳤRF!7pNĤƤl!@)qxP o@0~P˭w ӡӵ:o$)E'wu>g_ 3 6 `v/=y$c8ytKa8U`}x |]c`&52X$ H˓o*[N6JàZAXddA YAd[>W*+ɔ:Ae[gs0dOar|&RZd6}XnY;!e<_EPǹz8RD^rѵ XqnJ'#!,LObG\;d6$MNOܓ6'HJM_m8wkN ϧeJ)desO+z%z3CŎ%vX^DG AsQIh"GK0z9x5@Z9+I|10"xPɁ{Նs7*6!QeN^2=oʎeQ0G^)Gc=ȔF!R"`)u) L15VM)3nҤM.ZȍdV7#GǕ$-tU YFiՎ`І qLDY6%cI2tUΞzv|?ۨFnT|,uz쒘KM)+H0ÑYrDŽѬD.43GVX ҽ TE׆z;2Z'̠E+$'9@/9OZՠVXIy'=.~;-$!28ZK@i UNh]<7B6[8edb۫&τml#"")eRrtIь<8ed fg.&:pcT=R &zq,WMDU>WY624&"Gvln%V2z$pN3ǜիG=cM>vTC񬳺:>_5MvwDMInl[ QP6 }#%Pw1jINӲ"2ktqs#scy;.cm l{; lo@cŶGZ #IЯsw5R}gEf;5IQ)0VT6ytm߂6su~q}=wPNөh^If$"qxʡt1Sz{kc"Bnȸi/޴=8{7jmhS8>3 {%202`cDF\ :&K"H} >f"܋E(|l dTo^aPHl^ UO1mŝU\d6Z `4xi3ӥ9Au,utY䱄 ]4Fj J x P@Ixl n0`HgqaAH>푰7=-nw6P3(37s3m*1I B³@Jk #k΃jZތ?Xyg~JN3? 3Nb'Ilvx7,yW \o4L?cKZ,Ì>^м!}GiڇѬ8^ n^):r^,L~^R짫_~]e~Θ xsYᛋm.'8"e4@/kX+3 r4~8]%Od{:.0l𯲙d;;HWfmkC˲ϏaY@SꦜKR͟?φ _Q4x7c.o}*dM;xEt~hL`d{x=kVjM'w4?: EɰVӼ5Vg8FVOF!(K^A j0yta&<|apt@{RAXt^0ky)̙y:TdY+vr:m&rWG\vF#ܺ'.{X aۃ[CY_@H"OW 0_k\[9D6WI>o`o9r!Wmbx>|R$,)+Kg#n ;.- o.sûf@`}2כ$ݗn,̥<ʻ}R$3jh?|D*mEo0->Ѻ>wPp4rqc:kv; ݭwN&;t# -Z7!Xenzvuś*)Dz6,Vbbmy[n ovղS7J(;cCƆonrg]%MZZ20(T:g̚YqdF#jan6T /,'>Js ^9n̤HKe 3(& (DO Z1Yv'QF+isT^^UigpuΊl樓D/T,aR<R]f0c f]]砻Ьu:MnkEk(#7N&Ϳ!XLiٴf6>?M<@\s B$\wMJ›d{GY)XO#=sLr6BD2e@Ǔ.BL9&2Zpk%Fسa=Ryw{Mzl't tf\~,D׶X{8PO0cU}|L_5Jy IApۀ@fC-Z`YҬcYWR@R?0[΀J\J9jW ,/  p Yb]UYoK)zǀ;,Fg2'D1{tx+ 7-́sI岁ƹhsȽh4\g^J̣d i[nsvE+cCodGR6I[hyFdZzdЖ\f*giJeqj. .bc:)QDOsuc-Oޓ\W23HZ`$'xduZ\S$l+U7lQlI6-C-vwzG87a5E. 
~νVz k~e'27f8/&F852AMVY=;P.AQm~4}0~K)% d\-WXğgǑ<3 ePxRHj~fZM(4PnZ)vQ3 n}k`: {MYͻͮ>.c'np{b ermi#J6CYVˡkxovie|dY9)Ueb[K]覷 7g[v>yrE koLJpf޷y`/z npNڮtHfն`fw)N y: {O8f9ysO[a|,3DZVsJjcT;3+i4Z'ہ\n]n:q]nJEGӘBP!<{d 然֒xtvRNq,z̝vX-"ro0CJf/hrt(7}7U_m?ʪOby!;YWkNjjj*[W`ksWI_qq 6UjQGށ*4A Sq@ 1M@rR[W[1?T Hzk}E.RIIv~xk{3ѱ&uu4.'VV(?6+ <e`n<ة%a4UTsc%bDPo#`A ˬ{1L.Vt5졀0찈1&w c#usYXP]2=0ޫSQ`aem6Ck wrkWN,e덝 E>ג)E 02SCZRjsfȵ1OUGL0&!+Zq0ghY0?P*I_eE;8_g?ZmBEO.\"ͼQAA!+[o´u^Oq2Β88/} 쇙Uʚik Cm?d[pp0#@DsDG^}T+Ͻ8_:b6%3llfN2L8I6Ji.ӄ85[3 dy<\c00]o䇳uI`Ư\CxҶB e۽Cv=Xx~R/5G;`^jq/5LަlӦ ^觉Ѧ~o2͆1+CKA'7zFZnbnM07ftWj9jU@%19HW[[oL0@ 39:m|֌V} mj5oy?gˊ]\V^GZ1+d5˟cFV(]Ӌ}nĻ_%tq4Do+?cǃJ/B WPRKZ>ۅL 王1ܶ{Nb<k ZI _; uT=KXRiZ7~I NmyK JMj3ZUj7GvP+AiB~{(q7 sR6s76 ɚô:f:շ26k8Μ=`,L~Ԇ6x:izM#tqc.K)'x |.W+j\I'ۼ"+Dc .\kk3DZ[: FRZ!.Lk[Zmihƒ(|ӒHhDw) z'ITW]RRn O!:?pP%99)V~[iuiȜ%5rdbxۓ{V fE-6`<krTar5Vj\sWŕU¢0(0߆?HxYWV,M7އl^mA6x{;-*69Ci&gb|L7MӐi4эԑayWd1g7]9`~jM6U±T.T4&> $ T9lXV{ҠRTN/ Ǚ덮o^|WxˏW/^D]˫w?jZBYOc% x0|0SVV~UJ/_^ 㻗ÛQ^K3n>W`{7@oMqWPUl3,8A]bm]na޿.K^ )#)5lő kMle'wn'm?Zl$8WL6{4caA4ךs 1:Bi[0w(( mQg$^!@)iDL^{IގD :É{ /gsR|jfzszʍŮ6^PNה["tZqW;{,*GaR(uɡlݫeڥ)h]16lR c g^ wE%KЊUvMZ=SJX-CP=u)`6dab>|ʢI~Cmeo!|,՛@_`yР!lvEB^$e}N{,9BTP 1ziȆ%'1T*49@ (hm> Ѭ?(P?ȉR4gKro+Ի{/vSv>x]&ozQ.3lDz'͹BINFBuٱ[;wQlD|deTJL'br< >t*6a97=9L#c!bvEܠ DaEaVF]wJ * Q- 0!',urIK !!G Ơ+帓 1tm T8Hi7*>i%< JFcDԠyÉXA,)UJjč,sr=T`qGzq- כLX'89e#$z.cqmD[&1* ^J|J2-JD+T4hiB$ Lc,-g!)f㺾l9uC"C _zqViR3eL-DꁒP͑;˹,!D0Kjql4^%CQ:vJ̅&FXxA 1"J;1x# VT>AF.zȩoE[C~awSfp# cdRfrZmmXA-7OzYvo 9H倣Ozh<7@98i ⽏p^6x#l{!*+"B$^he3RAzGL0@]N8,c7Ǣ$AQ^xY]^iauf|bGAiRMm$RJ'P &Q/V{tggSsyk^/cm l{۝Ů}@(~gvu !-Tדr9_mo/s )vp*Srm(Y=-xdCƮk=85 xmzjZ{kZ(/GxVd-'!0Lyo)n oD"$h] i00(42i4Zc  Qcxg-[<`OFP7.QbV%9H$1|`_co pby\7n;>0pxIk4S :Fw`, |+a(hc>ZM돐V;+} 'ܞ 4L ̄rmO]7(`b.Fi'"ZQm2Lz5Xs(&'m홟m~F 3&>̆jYgpFjr4=~ \Y5\50~)ṸB,׷h73^CqZoŷiř4_H)YAi?/n2wC0 G.b߹2)KKئd0٥eLQIcJCi2şmuRX6lφS??;2#LyD@3-NAN߸y b$MNOmЏͨ?qf2l=KU h?,M_z=R+rs]g;NVtW$7BW\l:4rx6A^fgf$΀KV2yi-TV0 z+$Y\?̫I/l2ӛ:ImG)oy|SˬV8]J{H =T]/w n._.*%Z7lLJN# ,meq.C.bb$L)aS$j[\uւ.Ζb 
9keYXTҳ\A7Ys)6]~Of_H4ۊl?lqWao`ݸ er1jJݽCImwl-6<`qL2&m[a)ocZ꥘"9csB]`'N@J(! "G줏 =56\rr}٧n.qY\AE ㆋᘒ#oIWskυ}'K>$r2ː\_ 1ZdhX6ǸגZ\$$Q,OE,cZ!N,F/IQe'd7Oךz8JMG~/V~P]/`֪LUw1 H( sWaz St&WaDKJyL(R ,$Eh-` H4pHZ.NUʠoy}}VH%璤XFh, !bLWп%^`+FPt%L/ѱ/WA&n@܍ uN 9]XP"F5zm=9FIcR%c6(SZ8SM܈[w6xC =:d4jxZ} P&kϮWds}]plYUm;E=d nm;:^.t_.ȳr)LK4n|-iЂx焼WjZ[qE |&y6Δ3TsJjcT?3_*i4Z}@!7~!7vtܞ4!7 VDnq #iZ!GFb DhZ.oB +7‭iU"bK(F[ :aFhNP!l٧tac@jy|vꓯ|![o}n ajʶfxx hYogmk=_/u`1LMI%4XjS䖥1{ؚb~Dpꈞ@<~r#;z:ק%cKLptg}JX4ﮰrSkʄ2X &pPE57V"FMsFV;|9^oI^8gw|6@ۻ0/HrceɆ؃WX*C* 2,%2}x_8;8E.0g n'NX1Ip/V, Br)1A*VƼCJb 8^j4vCP q`:-RLp79tm8UXW|+jQ!8E$gQ\x ϕsQZg[ov-Ud`ZS_euhҢPx"%)~(m!fRM?Uj6/QX_Yqoz´$kXW]û4wwݱkpE[V㑻Y-Ĺ,sZy4d$=/XyZ5,^)|qWl11W0a2+=^z; WIEy<(lVAʦ&M  .f],n婢;WR Vof x%:.OrE}>^:`I6՜NKzPDP==Ÿht0hiScrmM'c~(v*Ya;9o&7Ifk{TR!rۣ]~V[$x+0n v"L(Bhp#*Õ 93W0e{$\#S‰Sh7]`Tz5%nlZ*4R@͌`qzkRp ? 7%'y}J7%XN/{p>`1o']ef]kFE"/ 0A0F0eEP,֎g1}[JDXlI[}UoB?~jk_ZSyv;U]>`;h{?ӓWG[8mzW.y>OSyk-۾|7mOǞM79Hm@³I#u%[~8֎:~zS1VnKV{"\՛|F] c;cGf~OXemwwӭOtk `}ǛGPFdy7ի_Ѩʵs+ =X7ev}_hD١/f7jh!]L}ݺ"gv58z'B~qޓu΋, l2k:ƤR"q}vi,Ы+K؇Anyx5n&{?O~~ɌHv xt*xb=}҂R7Gެ+blK^:Z[4+kV쒬K_Dbkyb_&H2uئs`)uB it5f<9NYrVCIs*~!oQ;DC)cGvt'Ӄ).6piqxb6+uٰPCXlXu ɡ<i=9iTX%m 0hw)A_K&̹;PG붗 Bi~W<{ŽN:^qzRn**}s/47#I!I5$>+i\(Kh!^k/Ōf:!Y@OQEdh"i52'ctldf8LiP, ?aG+kON3^hy77̓yjs_FmMчKgu=D؍rLm,*ֻeĘ[چؓpL).D \h33)w 9tm03ٷ:fs7bÃzIb jO󎇢v m?*W)ip*6mg֢xYLзԥéa!䉡ePR\Ge(4j B4 Hױ#2(2s7EY;㡈&#bzB'DUJ[2 ;ڳtL5-DShyv4!2q(m8S|%]KPҖ#&޹2=_5Lu.jإi^@\$ p6/f=gH¾Zڐ5;6| E? 
9EћNO1pqWpw<\<$~~곸 r&ݏfc&Cq~- ƯA[E5` !]W|7rsvUkX| jO!w~#d"#-y`D҂RA%#IW.F7@W!%TTfъSbsfdQQkƿ\|ckCa2k2 KM:OWj* QMuӚusoPʱuSZ3 L`Y@rǻEeܪ:ۅIt3t̹;Y׶ĮUz8娆ǗtG텃w HޭG֡E?GwtxY~IԮ5=|X`}yw,z>*{Ӄ|e ¦TA="vjl.d/̠>~pa]02 <,i4{Sip]qo~|wz[[SϦl /j=ggCkVzatobʏk>k9&|fHF=ӣMP {0>=u-uTz훋|xztn*XzV˷9o<ذj}jWRmKlE糼g ;^zvvh3S>y6|џ{ׇ?mh#GuI{7j{khl76-[08qã3 ~y|Q{!>>gQ8+sXmm''[PƭVmQ*#/9 xz)ً͂г68nC!!J57k۷^])Z~PA>Ί?ikSO>Ծ^i^:޳]j{-OX,-8_E�-^|!b.|}{z(-ӷpqGfu|rxty A=.5Avqa{*oEnqlv)OwNT%县C/&[]ks\͞+MwqCV}@Eܡ1Uj~,n:a^fCވ?xhɜ/e/c/ӖrI9SoBdΔF{HBAIQbwvejO&YwڳϏZ}%cOo~Wzf.K]HCR/vs4%wcKœt9`/M׃Οj)L9Ss!$Q n RJZ(M =w5`<؟nFk9?qi)7ZM.gz/TKԻZu!v(&2db-rX1=g$\ݥ{]&̘nm x38f2ײ.^ʣCZDL5q&gG}%Sph5Jk G4o[oaQS1\&ĐRцFTy'c5oMj}GJr^ 0Q3?vMIz+(s[遒,$F(-|\4FT)EJ`'iH%*T1ۦL 34EuľL! i3Jh,b8 cRn ݈THpPRp 3ωkhYanL.w / XiA{sM ,0J8I9nXGAҀ7=u(@OMk 2C䕖J66ra R"EpH(gg+ȲΆ ( nifZAZMꩵB(pY&2Й)DpUnk(&Ɂ ((⨔"iԍ^mϰVZSA+w"ѻk$1MOH5oP\ZgYǪ$DP:P{@:Te ÃTHrB\6G/j,$t_ IDNkQb58tyX80{K A1| (k@ح1H ۑ cl8tPuq*j`4sM%;o6ɪeMNb;l w\w5SuD!Gnh W%6#l/]J0.Bb,Z%|EhD8cwp `:zxJupHBrᓐGg!7zZS‚% ~x, ,DhI2%k\+\OfK@JZ*=XTLN00&mI*9))M\OwnnY)AB;ʒC& D+ϼЪp(k 6@hͅ%`Ph(f,xBf";p@W:VH a9 nKU߳ |z}a4FVeE]峨K-El0;zK&J& &H<6.pxۓ4\ F  fi5:;/Sew{'Q4(B L5z@W<\p&(J* e)DC'h}{ #,8 %h?s.GDU < 8@#UAeH (CAqGZy$XyK@OX$BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $z$P\DA=+ѐ@[< \:I'DI >7@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H=]c"`xH Yَ r$P1w @O҄*$@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@OIc" 5hH -9|Jt@O2J*$@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! 
$@OZgIuNJiu^/@CJqwp2(07w2K8 ~jXdfN iSgueuS דr볓ŰT^ug߅>Ti\}To{@);T yїA(|}]]iHR5zB$sPZNϦQ@rVU0#:Abj yt1A[TɲSvD5I(_Yb7u\xmԒ`Dj1 ,[B".v_Gx/'}.]`?DL[QLm:h.bAJX9 x[TSS3eHB㎆48H쫪H/DgU~U1(it~I*Q+^sNf/-ŗ^VZzOe^ުa(Ga,ͺka:(hսs' cfe[jVkSŎfe[i(mV˄c1v8cR_adV|Y3sw%Y{ hOz,1M4 ?K)< M5$ywgeasr{av^⽺U}>DggUwK*M;NVw+g_W5 ANN`)$]/Lf}Rizwp^S}d|1Ųq0(9/̥~EӶ.viDmd\ OgMof㛄\Ma_ceay 1QЂL>zxg7r&7:䦷nf#R\iwd+w ۥSjNjsCaAszW x/~e~/_Sf_׋?=Y`b8T\Vz^կ|ue\׽/( //W4e PMjT j7t'Prf7#65m5¬Ad;6mVy{^>6.cB ũ\1plǸGZ&vWUOO L~LEY.*1Zb *gJdeI~ҷ^p)=tb8uqRe8L҅ܵuSAs}H+'P1ikhZoo꛺雷} ȷ?xTpav^vT]ݮxѬBr!j .jJObAt=}L-_]667d5tZ3VS)TdRIj&lͤۻaUfE ON*f #γdWxaрzWQȔ [09Z[CD8Z)1#uJmmc, 0TfS[]oj8כZ+M7΅qgbcx5ɼ߽Pe?\FX;:}k6ӽl- >d(htfaЍ bs9XW f%WI-' ͠3~Co-큤:;X0H6A$䤗2KΪnd7e;mRҢZqi.>9 q*{ũݽ}vVpv)Zhy!Vn2hyԳWe[q4TYbD%]i&P-I[ YHy6"K"w9hZj ggOԃa&'GfLŝ)Yn8v!c8ySHю/tQ K2SN~zYj%Rbd4un.lQ3g+V:U+Ҥ^l3*ڳ,O"D,d{mh`*FF=%[," bXNld#smFTД%{p2%W$)II upy"eUP1eDzCj>$(C\bXg!D)o23tGG<@Z+ή`'Vbڝ)OAAęLټT#Y (d5 5jra6d6:' |3Ζ6g UdŠ4ˬ8@݀-Up5 Ԛ7>CYkyiaZAh s072AŶތP\z]Q0*ATM\zmZz xgYқ, VAkiN1J.$A(YR tL\ (r.YڵA)VXI'˘fIXJ 4n[+WѡV!_zyi< :=_O*N%=pRzk #Ș䎸@ Jd D8r[Ӵ~#YLRhr)fSh=|Ls !Y*9B8s$;U5B{4aY0LB`ez&&Q'N-֊8;?aFT|(+P MRia⡎pǘV)Ă5п:Q*m*-#Ȕ3c7,T`h\9ԕ<>^{EiN՚C~o޵q$28evlM #84W=R̗Ec5驮2 #us X P "׀g}b4Q4BS y*zK Oǧ6ckű)fpϤ`DFP@,YyOcL娽4e[ ;Ŝq3+lٯFNl-vPxG5 ywu@^uZ;fl#(-xݖyd2~'"!p% wFbL1VG Lˤ!p)6d4Du'&+ߩqZ`qI}+@" [O { F4O͓q$:57tv0FIoacj5bAԁ\!% mއng'H'}3YסŖ9{(שE-A#: /Pi6?+w/ݑ)1-^N RLpJe: 9\H*(@ "wBa:5jQGǨQrwAYaja9;aŎ2趠vs|>>0>_)۾W\ 3F>LjIWpFjz0y\I5\501\ὸ!}ae1ߛqƌ`#X8+hջjߦg|~B?ͣR!Dh0\|?$߼K 4Q\f_~]ƕLYZ"6%.-cJJSFϣ`/0NV7)eK_@l)o~z{LeJ4bN'}~).48)olś:B?5KeIiT!^*Uܫ&g׃8W}jAd+r9E{_.pVj<+ i+`29X0{^~'f"I,9.Y(極0坦htYХ0GjrdYpq^8&Dk5$岡ImC)o~}[ ,[hiӕEB fZAXMD<_i$()}EoYܛ :>D3n;% h"1SmzY􈫦Fl6N <) aؕal&c|[-uh0ܧ;z+F6I3 gP[]'y\2@m hƠ?l-&[DM07Z &>Zv:IVԉ~XK;rWdzRn *" 15%G* 1՞ ;I7x}b>dʰq(6 1ZdhHh6ǸגZ\$$U,O ,cj!v^$8βƣd}:13“U8CIdsZ+lK6kyB05%qL_,`+@iu'tX 3u^ѣf֧Rֳq :#̮5m D& T'ż1 _WsP LIZw1 Ӎ0Q2Ei#G%:%tIZIZ %i]m.tli,7:8+BR6K:1]IdNJFIl]m8F-V*^Kv4Bc cK VK9_ę7 xR{\d lA܍ uwF]>Z&BɕeNF]4F RJ/mPqxtH`%hQD2%Rpd4Rd@B  
[L/-a  <,d4 03HMVt7?=0/Vᇇ슾#Jۣۋ ,#BF7fjXǠ 4VQqG*fNWFT9{a<pc Bgfᮿ0f!\z.aJ+ D ə)x$R4cW\JW!b*wuB6,4~,2EHx~3cb@B ;?H푫=rGU~{d5M c6i3G4  JIS A-x&hc}:94vk}7ިތ͋!6ؚj'9Z\K?Oף[K.s]J^Lx bgB'?kj)Tzi;Y Px8 aEOydJ'܋cT$Sv=Es4v]2qzT?aXUM.Ic*%awJO C>M4E" NIu[qc2/*["ak`D;r;y Ҥ96LYDf.-cG/ M&M3TY#Cc Z-,VMqn*B-*qgd`7l(;V4ϲvT `)KZ;-IqIRPHg|urZ"(TԡZA!һT5!oEI#}PrS=E rHRdI0M:rAТL䕿PJDQ <#A1EdR3@"sNg:X6b#ے1RNPR!z{8銹EcؙKX+ӂ #ı65˜{$K"|7ٟw mw@$OH=qq{(OHJ}H $g緯D4WFKьi9Ơ ʬ`]fkĵbl'+EAuW A-PH*{O-XVԑ=S-ϵHXb/R_PTɓHQ6%ORHU^p GL* J3; ꂷbm1BI$Vr?q ۛ`b*=tv;TrtogfbW>zϟ]x,D$\ /3BPėk4eF8_׉T\#yA2(DP6H4<*;Uh "Rǚ}<$X)־tl(㜴+yB)>a˥V9]j-,}rǾMW, 8*bf)+6 "n .)`!d)x1af|d%ns⶯W9mbNPza?^~/5|z4{[_=q E&:tS0L ?NVF=.Gp}r?&$74(>sRd3sAC,mD»HT;M:b2.0mAۉctr1XHQ2!JX͍J)RňqX qpj~-9蒝rsj  '<B2np*M%`u'οYX(vBwL3ms 3T1 )mTStjhʩMBڡ̓{$s}J@lfETGsBRBL Xq,6"HCϸTxi˭\7mm )_?j Wzx:£ Ѯ.i}1(r[-r=t%宀JRVKJ~:jԀ fSg{vPh Mk j:GrV334!Q#s'59m|J%a| kYkXp꫍BǨ/^ZTk/DGDl[P NSHbo]uOfL}5EES="9(8g$C>ϱ\} 2)Fa\R:?GIbk&8Ǭ[RD.`udy2;$04v.+Gu<]3tm_>4n^> QN0_D(7bD;̃4rcV=8wZOJ"EHq!i[_ݒUd-D4^f$ln~ɋ Fl_;T7dNR_ږԟ:MI?لbFҪGt:7@X:Cjl'$`mN;5wq$a<"zf=<"GHҎV&ٔT&Fzh"##^f#m8Ԡ88~hn1.3nЍtnSJ{ ~1Snͽ)m+!x׻5h:h+3D+J8S;eJ6j>Mڛr߇~?2U7h!Խp?II\ruKWΛ3\2Cmy7_.k}l] >:͜OSԒu/޾wHvp4.`#U~3x)|pTO_-5z7]Yַjg@GR@55Dy:0KKdocyOH^H ^eo>y?[ֳM]fnr'r~;6Oȓk3}8`/_޳;C?r΋BWCЯc>{p}{ Wyxv>%ϬmSo?L%/U6iJGIwj-iJ .zjWZU16vV]'k@˯qvac԰eZm?tk_؝ᾰoM=yx MP7XD'oҏ\_uˁ=nW{۫>`[5=LTy}7E;! 
N\,_W ^?>8,UÇMa[oܽ4:p_joў}J}(=V6:]6'j~o9Q7Bw}kMlfB߯oji*؃E7XzoE9|%vgGTc 1)b d'N)Jw|nM-2ل8Zo#!8n*t{ic횆-h)_9xo'φ?Um n:C >q2|]06篖_i\0SQ?NZGO)Sۋx*1OiT^axdAG*$ѯ.ou}'wK~y\|}z0|*j{37ߺ7c޴;=֦/dÙ|?Y+?羑[`4(gϮ\ xu$%wi:cF㏵S8yt/\ 8{zG7_jRjɫ=ƞ]XDɳSr'sv_ c.y*k\jrK潱5SJTUl5TB\Bϟts$>1OMϺ*ԭ[" URvִ'8鬜%QKy3j37a 'tkkU2$Z@ػlrW蚢qM$븵"Y{;Wۈ1Jhn5'`jNդk=pYwf+/}wVLL(5e6;N1X&s7bs+B\ܓ @=LP}Jf ̙Ik%Č6i[Zͥ$M xZ ]t%Zd}J/O85A1ŊG Sa)Щ@8VA H}wZ;=it*e3L| 0PWÍԘKÄm1bļeݫXfzK&E}쫕RhG-+/e<`ʔz)ee~&hs6&9}:[>A9(' ,$BNF{PRAscfJ(L;(ۨ ԳZn[!X"xh DhXJBrɼy_PAY36:f"c4BQRFEFg44S)k B j-,ZQgyhta0`ps 7q%hˋEqJmVW(Dh <%d; 2A5R2PC ic֣:dЦVB~@pCl{6S S'lv׬p5TM@ 5PF!}UO66qJcV}TQ&Y鬻-``ĺ6b*b 5hv{m7z:}sLYu a ҳk>ӥzp+\]m!B|0P/ACM .el@{.r3侀 1wYؘC4PR4(%E7S1 4 P1!qqcQQ,i=`(DW`%.g=FU]@bRyah%mN8I n)Yk ugj`S)Ԛ/@8#,1Ej>nd0Gp+$AtzdȂ-(uE\6+9Sբ<8r` @_&8(èedI=K*$2 + GR}RlBPq8kfSA| }͵E%)yj02LPj92nTeld,`LB pH2LX 7rLV{deAy!b-ap(֭b2.wJ2 Rt”NZ \F10gj1!]虶RLuFklhDJ -lPу;xԄ0n в4Px/Amq+xCr51Y5,b#.K Igh%F[(2xPE.c!ä!Pt̃G>)&[8o@H/ A; >nI$,.NVVمX=WIb Oc@&dЈhP2ǠށKrm6^L* QTCƂYB*(#L/B J iиw0FOZ%\V)*Ku BVz!b jhaX3cs#hu tg1.V[GuŠxMuNuMl@؏ 9B~G?y QЭ dXR.E#HKAGIzH 7^ցNi,U?Yhlj#{4B?&h pNpc[qc-\iҪVX |ڰ3Tj\WO2 =F=2b&F>^txf4$_j½戂sDW[˛kv5zWN%[l"譭ޢsۃzӆ>XoYeA1Ö_j '[s2ðM㝋vX3т{׹d&ă3ВrK9ޚ;oݧhM1)eiDCޛ#+/M7P2=AVAGŔ|L: ?$;+$U {F23C֋AcZRL&#T;.T_` iD@Ӑ:8 Im($u˜#2v~9?[nobs=mgMb˿:j^θ0{cͥ6Sdm5ב:{3s\wܽl&eXzs?5/~j^e -;nN^?'GN~v|Lҟ5l$3YdvRV KnD%&oQ5N5:1e>';*?KkjzlRMaXqX{<\X`Esޔ^p? 
DUBHBv\t09=Q F=խ_) &/ŵԸ,s Tei^3Zy:1@+=vJ7~ug7۽ϮGRI-3fw귷 E;5ogqpJъn0l3pnhmv1[Ze%ɶDcp0J1B,OeY dzꎆޚ$a] D .S/h!X}FƸ2[ڰ‚bb,"K܄QNH:D L_nqn@5uGb->0';86EOx>Y~#z_ff+O j/I@0z!]w<DZ \Z+~_SemQ #BGݛ쑉6Ll" p0b %SC tp]."-seuu|;$3[QgmkO:QZROVQf%%W9y)/\T+iۍRMQhrQ\{^T&~#wI"JzY՘57Rșet "Ena(u҈Mڄ&u r[Zj̇NaCnn#=T Yzv5j7^ڙjÔZV22UUQ&Y&3_ʙ2>謇XH)vKl/P x!X奈nrzqv4>OeU͞4G)|WayԻ!_6)z_4g֌N1n]TCzܻ5א/\EVq>nwkʃMTx֐n'wkCCpmS~-wO:n y4c+bsA0gm`4fl؝Y[ Đvgu;k!qt`3!2F>8*qLaf>fgQJ ]n+!aND[s- kԵB5\n"Ӑ)&S@)- cՐC`r '-%TnLSLt OpU&,fZg }Gk2$QCv&*9UJ ְ!U UFHZk%4JDJ7J{M̆psQK ٩p܇\n:7̿_Կ|Os7a۫YNf>râ³|Q!|zu­Q%ytLv?nI=o|r7?ro/N sc.yc4~uex=Kx\[oCǿOy:co7]%o?:$Ep<侫ÇG#~0bzOsVIǿr=H//[hc^|jE+fa K(wŨ&6w Uo+,5,DdS `9)K lٻ=<+Qh9aA81vT1BHFY 2P]B^"ǜ o /D o_ Z=v=ф{l5)sjظd݂' rfO}OV^اzZ K=ZYD:FCMYi㼵=1CԊ@ii{V|8T\:QOr3fj 'OgZufrrEmvt}YF?o*o?Ջ|ϯ~&xh9My~<[|._MAOz>m5^>_wɏ5/';?fDq?-~~ڸ fqO;?fe-o=ޟMG)"^ު6Q3^([lR mKdeO"p Nd!4>.d==S֚:󿿿! Q7,yD44}ūy&$Y$:\\nK: 1^DQf\AYqP`|SDWRjjۃn P KR HZl.u^2# Jđ0kݟ ˡ(趏ʣ5Gd1>zP7gY!B,/w;ۦ,|0< sO*o[JŦ@sS \yR̯A0NI,&hYX+讋*`[ 8StY \Q&X'8V9+cVd<6x+23 i_$\I̵{TL0@چO+e`@ dM)`\yxWN"jflXesDd!22@Dd,J 3@ >*$jjPIӪX |UI NJ f#0 l Ca[!İX<hT*XCJxd( )D21R#5 q!vuR`V#eM%PsUmRq7'D=0+AŋW):k$O[XtLfr3%Czdo"7i7@n+$ "?Ld&3yMl]kMtBYX+[q̽[}}N=RG! x c!0A݅Usm !#CiXnrUs<$H죢9nq=ArvdpUMb9Pb_s"P> gC`1$c2!rmED6RG"5r0tt $éBJ(XXD YVF L+ #t]g&218$*Eb[Mƭ`]@P襫%,#5_AͳJ6J' L^a%+QrV1j\*^Ck0COU nX"Pdc +#Z-50fj>o sL5UG@ ay!XՈZ&*(OFrDZW"nLH#[-x,V2a@5p6D޵q%Beg|yA d`[ˌ(R&)[ }$ŦxfwSvV!IVթS}Ul 5II<ƅFiiOcHɊ^ ^\oC ‚1(5 &X?+1$Sz!̇DtCdJ298 8ZPE"ه3,`F;(}XXeE‚{l+ELdNiQȰY6009 IC$i3`܀,  d^:&,H&(d}FRai0,Y*$I08M 1ArRI|' ˬֶmӫO;<qR)\3ٗ 001awT;oah>tJ@VY"|iM1 DDp\#=f Ι$xc[ J)8G@!dу<Vdf3J 1bH^NKu  p0Dr 82z'u(, t$fD+ Wdax"ZD 6D1)dÌDNaXqwϼ A -D ִ/b L#\Բ0gʘ .8 eM jP*bIB,B⠰V uZ=HpK!L 5n8YBDFrUeNmZ: 1\ |4L7ϦGsUf&$qD0uaMq(Fwd5Fƌ`M Q_"bԡL0 ,pRV(RV40r #-7f1%1p7LJxKddUetUC܁nX1? 
њw:)s/%2Pԃ `zdiOaHP䈰۬ɰX>O.(dE^aH"6T''A"'0DfntG +6a,Ӣa!$> Ɉ CzZ!v),K~y71CYhI2܃csr-6Zq0s=1LKEY&Jb䃡,T 0iCDzjS^❦-8EWǯWzraW?,ŧŢo$;[3zQ`ݳ:X#ʇobc^티Zu1^UZr̫NZK[m]XZ,UR !_M@Uh i:IApOa?br &`窅Tvz޷l@Lg [j?3;ߪldg7"no9׭}no4ZU߭Dv^`~A2ҽ7RK,f3Nzf|3N-Pp@'l:;WUrl(wa|9Wח1G,?a58|c?5^8RȻ$g kN$$WtHI\>h湰:"!p$u\;-U4F*3k yS{1<m/˴$ȕ&]V(sx*?ix??3DMr(7eܛFtn.[0mEth3_Ò(ի͇\/ "rG4K )c)oc$ ʏ`JiE^s.OcQj``8WFɕKWdƁ+|aAf#|271Q+8H &`,4s£4LE^0GY2%*|.杩&^̒!FUp\CD4%u^⬢ OJtK|pY+1$1IE'uhyEnǞQ:2"I+>kpb$ ,t7ZH`-  N`2oQ$) 3ֳB6P#r߷ՠ3i:4Rs]c5￐SJ (Q)eIJ33Nq;SCcp^1X 'iJ\Q3xmjSSYұVuaQa>ʟmDʟIϿ%9CKޟXDM>< Fh>>,.J͉_*| d*yE=z%أ+&TW^n0ׯdK 淣%㇛djLFRPggP*К!XEavX0.f^'3|8Nd/ dC_J}L2Ep|ntLVq0X5J^NGΦ}0Lѩ+ùv?  Д!V{ZnD_9mHG5م߫}+C+'8\k8u&wy1f`O+=$5؝.ީLkY:e%rzer~R!y_/.dR^(.u*bvdrZ5ib6 }Ňj'޸'\d9YJ*%{7Į}eJynvJF["D(4fm|(z:?ײB\J}䤧s I<߈wsj&D(0!J.(;:o!&'zxڞ 姌O"z Q0N p"a|xoV623CU:o(6ABaΈ0vn @J]7:n՗_JX^ҟyѱV}}2֗ݏzlgr[\Ϧ{ ڊ~eiO?Nʉ~i˯kʯ):j;8} iܻp+:sqύ=\"-L{ؗn책FRP*{ȱ.m4˷RVKmvi2fXWoSVf+B^6HL+ @cD550x^>ÃIVQGuHuK050ׄnŅ+~z=@&n,S{DOO*/m TJn[moݯL3Y=:i{\sS7vn}l 2~+uQ")PR\zZt$^*#<>T_O=9h1˩"=[wZ0aɅN^55=u~_r9Ć>n[X(4e{^ܱHXҎ8)-Fu@ֲ(]^c*xG^[S^Eu2Rc؅n<.\R"3G/ek"ڀ.,+9-725%YVx8r7nHKE+jު.⺿Z( KXM䒖>6ޚR_d%詇~J(auO,8 @~+ZNKeU^g4$B>٪f-2FZ20n ɷ\T>/BR" /Bd]>y@Zu.iSm 4J}nwlu |x>_,%.ir VkFp0#e**NHC #z>hMTR 2h$1c} f = KQ-$֊3+@ٸwwSjc%D!&>o ;JcvwM-oۨA.E=u&R[!l)qxR*\M4\e^ dLD܊"uK'0#+Cz{+}{Cg:zxւ815c0j\ٻ|!_0OO(g|dx ]4b鋝ؑF>8HMӖsC%j85rk9 6ppZyPq̉{VnV65t WiEpBjbLOcbu^𐝺hhfS{ASn:'|_)DhKy /OPgmVtS9|?C){NhwnA%t,ǒ(ʾE29/35Nkȳg/⯭YYWQDFU=pS\"+<@L(xՃᷮ*b͟bH 7źPbobq!yVk&Juϐva+/NrQq2*Qy>y 1(^[m5 x*plZ9ȼ̅'SkT~ 0hO^B9FojiGK<4B=^YS2 ;*u yJUk z%* uc$ڭ#"&A0O5 _.6iI*TzjN-aux1܇m[? 
%|Tvt;x:7Ŕ49ijeϋbKXka:fMe7[$~]dQ@򩓼%^gٗ̚_0ė=󮿿հWz\^oAޙñgW} 4@ {0/?FZlSz\_p~_;% }N %- -{bL~;Jc.y^q{L"b Ay퍡ݶכMW4#a+ˏ?`Tm`۾Bl>扐D}EJӍgK J x'd~,ԥSV?]_(P@=`@ 0$Dp/I\6RhQقMAu;VB5qǵLMoaܴ"h)o;:4愖Nl{xh9^r0 c&,ȾBW/pK+L Þ|,EZ$Y{K^cRfow€zjy%+:kAρ<֊m=%S⢥H }qZ }6" }J1q^nj[;eavuF`njT5[غpkn9[e$l=y=՛e(?s8p֘%SYrwUk7M 8*7KӞ9, ڜMyKocgQ Z}3uql*T:f_њ)OL9a"AUI3Niu=;O$@{lkީIqFʋ`sqb78Xl ŶuQuWI}s<}f |jyع=GZsVj{3cH>ץ|]Qh;i XxoL;'dXvd!BmήHP(-+ꘟHX=Z$̅6xe>0]YZ{\\vӝ-qMۍ3-q ځ@ BE׫ή 7V٨suGmY 藏,G/H@N|4Xn MF9bjמ`Lئ?|i Xs3F ׽|8^n%ZQnCJ0:N n{3 TEmR?e9J bm`7Ahnb;(M/Xe44>p"?(vT}ӭ/-j@a{k}yj@x2w_ s-U.,yjϣp*$ TQ0-џj`#v|<2 eًJ5%jueQTo~\pWtݿM+B 7jmV=^`fCF|g5&sĐºf˫:BfOln9.%]l'ʼnvD fһQZPx;j}0; 9:[XT+U$cǩEx(=۩1(pY"lyv7(@8eYu \ G O~{0k}6|]bTpuE( =s T2 A=IgHrQL"|$Ek%j!.\^N9h.z:Bɇ}`pq_V/[/Qb,(ĸmJ",:h09T@)]i+a)Xr%~Orw :8x6G@+$Q=9R#Pj"i/Dv~dI&hlRR㎷Wu N4 _fc_cvP9Dktra5| dWeA֪I/H϶ڬΒps) }JI OԲ"r ̻i`V0t%ٜ5G"NHM_9s&wxsGӻ"Gln{fᅹasᛴ.^yr:<}[7NJ;~<4en cb_ pFiʨmh֭;^WkN@y #RqF#eD&N\A#Y|ߛ? g3͚{g9%bDʀ`ă0i"@sЅ{FN{r߂{:*CR'Uz C}ZJj#RR]oW?Nk) Wԡ8XsV=ï~s"%E3)iiI?~8 E_xLx}Or2/Fq8j~c Zc"9%:%b^5Rgdk# 'Ī\Θ(P[AiNK9ukO'=>x99پG5d [ N <<"/g'_c*;;3wlQoR,%IMcm;;s,6~,lKсvss]mxe bD+mk*hZI릴I%FvSK1,6;eeu|u%ԚD/#eWl#fi5wyVXOGց"ڵYi0Oׅ?p tiOj?ҋy8}ŔYCvcaF0[<.)WB2NΈUHO!JfUߩʢBK7m La007!V턵eDt7EL @N1fV^B+B wn u^3 Hf>zlvn6勽Pm 崅MxЮ[EZ0`.m1Ag7$-B4xm7ͣ"kGNkIc%)B&HN?"+{ZJP50F!ıa(a`钊(4BFC!Ut9AIY[ӓ%,Ϭ̱ [V ]V{}'|Y@!C.hJq&Z$+%0+ֵ-qOУJU-Sxw&4?eډfހK^ xp8UDc>mi-a'­^Vwϧbc0I2=A ~/`5U[/uμ_ٲVX [\"5}O+u -Mk=ӆrM%Fs=[8K_.uÐqS$c B(Ņ&T(eXhH"%H1T:*Q+Y.FvG\ [3'A8ZѰhe4 m-J<M ~(|҂Ti!IL1ZI2 LC%(QPP-F|GnYŨ}wm~ &YC0;lvM^N`l䙝 ْ"ž`.v[ͪ*k-U F1wr꿲Тn&)0Z^ٻj"|R]}l6snO)vS&B&|ipHE-avcI¥.ÑN:NV0UˌZZ@.QhJd03 (Y1.uiV@ u< CpqxwJ)3Y};g)v47͇*{<_?U-80)^ՁJˋ6*!(upuj4&-ǀ2dgՑ;D0Rk5ɤ-f {z>'/mOE$D?>}ƋH_wO9Ιʛ "U #}iQD87EJZyu*~.ygo,:tQZR)Qiɹ$ Rc(jͥ*)7g,Kp(e)mNfH!-羜 @ٜK_j4*wde_ R!d'A#ɒ1ٔ# Է> Sp`\:J5L \x7E ^Ɗ"] .J굎.FJiTV9#Qȕ<"t{$ N%`M%o{2&3ݸɎk 5O,Ɗ"yC%lM0#>p_:yP]ͱmvp=3-u Lj$H\Տ$/}Aj (PdZRFxtOrC=w"f#ٟ؋a[^)seC!?Rztx0CjTGGQ݁8{` ! 
v$JR1.vRL )1rlc!>H½AjCgQ匢䈻^kAz5֦r0upCn= eÎlB٭tXpZbaFegaH 19Buv뮞#Rn=v|X[<9}bM0Ewj-@D5vAjU9 u~kU|!֪Jd/LH%g31ê!/!J#ToF~kv<<^/Ra|\?Zn[U)Z`y^ٻUWâe{kuvt>akݧ3xΓ/QXEG ~fwi]+~в+) tVL|%^>8* .g=W_ՠw'/fM '%C>!]x@Qc]|OUUL%z8Y^Qe?+@۱ǩ[]{[wmݵuu[3O29f2Q/L +bZ $1n pߕbxi>V\\V)U#]M(-gòl2._1;dTHuT_?!$* ӆIundܲ"K) άR$  R0Ef Mc8Kx:Ni:fTU:|.BZFѫxՉZ(hb.cF] 6 {r!⛥-A)+dM[ @@)$[Q? ;KP 8df6u@>C<.86[<9Tl({|s7Ĕa_CZhpZ+Jab@+Ffl#G TpIH;2w$='7"Hs=0巶܃twh'k3eK thˆs -&9\Wu:SʙFab9}>KaFmn2)ldn(JbIYk(G Ag *Ҙ¹qc@t- %f+|-B劒\\WA$.LrAr?)AK[diԌ5o4]oQE'+ldJHVBy"v6L^o5jL:N}G }!{3&mC }3Oє966nѺ dꤾ#ƺaqB-ڶuK4Ժ51CMSn*t-U!S'1֭DTά[֭Ƨss%iQNiKrB>ag' Xkh~sjOzKEV!jzM^f`3ē< B+Wő%Y*(4'c%;o',тR"aPwem$Iz!@ 4v6;BԱ$eE*^f -,,Ve b!-JW :cއQ/_B(Rlta\T^Zn*I EA(7 D;K6@=_L+;Pݕpunfr1˷L&7冔 JFXjIt +E~áV+^U|ް 5X}ՆJ<#T@)>sR8t'uMQZ3$[ި6I)R2au * I _(zcA VFUxbQb֏ vJtVۣQya4*fړ\Ծg n|I FZn=ECo:~JE@VIˊ$&݇pzgt{BЈ f*!L&2q84%ythb86?y¨<--xsf6  ~[6՚]4cvw;-CqWNjϞ辈~O_C SH\v,R{YG%bm׋=y˧PM-Ct a)]- (}(irMA9 jAz"|8j^-{<1 rN# g }H;qW8hz"=n~y1/-h~w?gAAWWeг<+ uv+[O ~ bHšQVZ*jm?#ğyM+?oԟ+4%/^)6a*Mڇ?[D#;;V-A`y JYoY+TV${9IV]+Zwqs#3,5oy1Sv|\lW֠K/+7 Kd]_Y:hHeE,eEfWfUVp}sMRZJH0ڝrP鯟wUx&? QHtf#pՆsTixc $'3uޟӯ@A|2.^a Aq)*-\sU]wE4iFi/9ΌdHH0?fK5x#5,yWbq\ ^LvA9?"Cl=/M2瑛^UoPfα`-gYߜe YwSWi Jxd$QSYP0L! Yc"4ggSlO6D,LXuLyEryyrT0phPӔT?%e%BXe C@kkye gIk MZ8ڜs AFrςDl Yl^[h*5hai"i#pbr.ZsnƇ}J~?TuraFW:zfӗ^wxKRh*̖`^5#M&5Ѣʵ_yy`sdLL/j~ngKMږAsO\>BbB7|T=%A ߘDi`DtSj Z@Pgc ROhJDz" aECTǮԙB(ڐ9/5:;lC,OxΩ&D:[Y )S B|]ܳ匿6WK yO0$90"ET ,ן5ØǙ; Dn@!8U`5Yr͖Y) =[=L&o^*˜]1$:\ؘ\o(BP<"[LCF6-F97 s5 9GJaSyݖ,M)W?!y -`=|Wlyt={._[ryDA?*Lc?9z]QkWLv4 zǫZUuWw߾9xޙ;.޺9{ז9^RHֽCx!ⴲ#u[Zw5fr 6_{Gqs3Z3a~VX2{3shgs:3rv gvӥ5Fz 7~2/L `!g!^|2]ʻ|}TKcgu<:<~T*K?fE][LN"K[q*'cay1V(_zP[>в.)U $guzV(d3ZV.B.1) Y; ӐuJxu0+; 9w774$AU4ai%}^m,M5Ud{g P|$GN4V& ~I#p^X c>pNHʽاP\o!! 
L 1LCC;{\il#VY iakna-@7ޗچFT{ƾ[zC2wj/Z)*4#NC?%jbfH ا3gTI$R %qZjd3nb{PDXU^D%pt)N HK5:_C,M@֩v߭,euΙ8jRsj܇&~8uϳS=?OFg۠t_y[b,.}NtjΡdY`}S5!gVJhsT@)>&inOc3D t.&wmk$>ۊ1zqK\?Ƥ\AJ*+mozUtEbTx?fԬoh,qH>S3a|#0~~hߦ-ӺdbA 0-TDZ j K* 9*=vseӠAU!]zZY+][γ6v4B°ώzAܾSd#?[9=_ʅѿ<ɒ*3j'a<狘x{W^һMP}-Y}o3~(QOw!  $XmmS۞"%`]s CZ#7cT+NF=L`;E]u5`vJoB +N i:RuO9ݤdd'(OV9o>ͺZET)KfM@Nά}49p9("~9㻀bHVh4m˱D{xdNy `^tJvֵVTt )]:s>V .L*A Eiu8ۊ Y!Dx:j'y=?__g JV2S%' 0-%QA/4x\* Wc~78IDI.&꽓yCi>`~KҼ+1@_.`I&D1h!Bۘ_J:8b2ӼeIGN!9ֆf槲̘Ȑ\r1  o1RR1#y@4Ff%^ [/{QOR~:Fzćsk1!N#<JPafBz"61ǕPcȒ׳A;qԦg=u$+Ѭ @3ȱ8!QZ ] '!׋Lw^zRbaY %}zm84ӫN,[C9V-IS8:HUwh&>rpNߺW%1ֈq!Mk֣6SׄmII`dGdT xd(|GT`Hh}G1BӋ^QpuR3#Ra<a,}7n^J5/ TgbV8fOAH C,XJD03Ì7(dt3Ғ@שD;qw Ec 8c">LF50hD")Z\kf/ͲF 2;~MtҴ*Jp SSpC+,ViC 7hu qT0+ c[&G=69ܸǦ?@=_T&pAY(n{ȳ0u܄x&Kk#狧 i;a:B5Œf'x\V)}h`ٻFnW}7+1?3N@6$XYɖl:,ؗLVTHazY` 0 e/G >C3)a<$bINyQ/(#%{eʭƻ:5j.n[FgƳ㔋o-ʁ2̷[k8zoWX:uU-S9HyNf) CWfo&moMzag1zɤUB66k 5}4x{qao>Ainp6  n/:x0Q8zJ= +/?M q̍rĸٱ+xڳd@@@y)qA2#̌|3" @ВBB]Tt¢p\SlBaG ИZ(D50:kOE8^N?Y׽rfx.9cTQ<Ĕ{k8 }k8ͯY0dyOA8CQAU @U. 3FXKF(CS81lmIR R: BNsom$omd5Nӂ ދLO.DlXh%1^2 6 (Bd`~"92Wj:^dK:}+ݾ ,Aj pXQ Bqč a00 - ` Bm/ˣ,9O#`=C3W3k(2GױߝH~o=$ao=d5 C&VM0eqV4™ZŃlJ (Bɰvc-:ΑQVPA) PeWPC"Tz2c ҧȬȬQI hl-2ՕKJ9]wD= 9AОL hOyU~B/ _1F!)[)ߎUy0KCSln{h1)}rFgR+?/EElzLnX7I'v^!!+EE׊B%Jv0*Hv1F #!ӕ/ҧ{ECEaߵ3{ UxûwX[`jAڋ*giXp<$l[ռtn0p?EULBopEy#9rt\>3 KW/쪲zϧM+Z1u] $_3ܘCkhS 3XT*qt\>T"fE~ӭoO(yd.n}}p4쾖vse*?!"75BͺP,Ԭ;kkM:,[ `) M%€C44+oÑ TT;sڙQ݈J{ Erb6q$ T"<_S^ - dR.2RU '؞n̚FId^-סW^7ȐM/.¯+W0pv ~. qcf0Ngc~j]fyUA˙^7IӞ,: Dogvf_2*MN0״}w\s]`)=V~5I3kh\ RD'w*ڭ]ִ[6ڭ y"%SQVY%]&c֌Գ3ƻI$QW%U[ L) * cJ'a-ndYgDJL LmPƬ20S%7B'^VG  P! ff5 H7P5 ]P=۩I +Vi5QpXJI(#@'ڥr͌FAjuX#F K-qBjXbmBLWߢFApYSaTӍY(بó\ RrǨrV )ˉguH3,<mh7H/[.);FI`)ilBS[E4Gln{v EtrǨݺE `nZeꐐg.Y2%ܙfw=8|Y[ 6O PaZq< uιq [ (` T@1{%ECKA2%p/ fIۣ$Z:$䙋hLQ{EjA"z`Fs /bV/ ^Z B8.85K !&2sBF?Ҽ? zUA\_ pq ޥ7RG5;ǫ׎Sp;x}nK`YC׭WanlM4_IɚfpHǹXʁQʤp6 % G*h&䴡 ́WS9ɚ:MƧ.DSGLYӕPԢ?ݘ5%(ڙA;FMw)핵&4uoS\)Ǔa>txMttI /x]뫳 l1SU{(@<}?Zݳej:?fQ[lj:_Rțw(">\5D ^EIH04?Ns51M@ɠvT^ `b?  
qoLwa* 3uP5wZU~)[.E4>p4\kaDfs!D|oS*_7$k%HmݺV_fgc%D̦+.VˋbX@PmV[I6Vv,]~7S ol}HJi:[{,Ǣ%1^X-Ple˿r p^,K.o ŌL /z*{05P 2@Ë ! 0x*d9o".ؠW+ 5N2HuT3ҋ"9mT!N /(,FJr1bj.8PM."-2^-Z X 5[GHwGHv%]?0sѕUL16s8jJMi_H}` G󋱟͏z業;1?xg< Y.br=OGA!ԌSV" uO臈OyC>؉i"HD]&xg3< Y b0>MR@^PNR8f ԛ}pPcVo7P2D ܁Ns_(S}PFC:!P:0YΐΌr 8o!-D1G(7\*0Ҹh"e봶s( F5`PЦl5q!-wXT@J֌$T) }l0sȿ?F7bB@즫ڜ &z$|d́J ya\a͒(Dp,$)=3u6jX` gԩp;K37%>\}pyc!Z)X0D$\pZ ĨOMkOUynf]8J1W H2h$|@pR,Hd"ʹ@>UVbTjPv!ٸ/`FC+JMSupFQ XP)In$!%R Gt.½R;KKʥ6\40(b&kT!tQe]Y%d,F$u@F[V+RA! fGij~Bꍱ["ihv#͓jjuCWvn>Ev[Wwoަ` =6R ~wWthbw޼Y>'`f^,$ֿatWg0U^5 u  @}wؐ+Q\`rqGbFRy6,#CN0|ujvV@>\-~w3ֽوu~վwv;5.MvςN0ŕZ/ys95B;)9lbșFiZH6Z*Hւ0]X ɓsT Jkߙsp+' %OZſ[4hyM ڦL*h7(i m7)IJYm0d ͱ∛!sip@֣t{T]NEo4shu1Xf8h*PCQ+ylV4[=FHsKm/i l0{M M3\ʳr%q ,tPPLLY@/qL,QN,*Z@e,DơGu-Ɲq:6;t8$geMhglEk4]c!q[ܞU~ZB؞EJT+W\R =]N]ݥGzO }*հMTw<*5nC3 c<tzHfF275R"8[l׼S1 BU^]'u֛RmբsЫ3q{-;ri];oX/ߌm>z?q̠Q/y@.7n98*v;a=?|gfPʂ¼+Q!ȭ(W^Oo`4r|{#iAաE74ȃ?-W`pO O 9U8Y-Q!x]5jj5Iޣ8hM8itijߏ?V`tU>ݷ/`>_+[4j-5ʱǗkǪf:9//)j%/hm=ՠ؜}'S6tT |MftEp8q돾uCRi㋋{E5PmE{Xԏ r¸ Al N^_}/K% d>PG l 9e5|34ph<ņI}0?F'{qOvbiW֚'y郳q=!Qwh/l#K7lLc?7_}9}懎.n mۖ}퍌[y[#Ę֓O8GMưw{xvSx zf0ŝٺу3v ^ ZhG{ _=yOңECW>k+^-|nPa8~bQ#;rI*g4=i^38:h CaCɴ$KʲaL8P뜦AD"`w4mRdnmImAv`*B@M)q_M=0W#N#۝jlSkݶN_N-ewµfC8_ݛ~A\ONkQY|}mnys=\WA9ZxfGًvb^,~(R/c/|IM2Mnry[har~,R\'|Wŕ4~~j0ezS~KOf|;B {ܦ';fqɇ΂CK^>{̱,?Qqz ΟivkUV]% W@T6YKI^o饺%":nX;Xs/>%*&A+ 2|=>^9!X ^ ^ |*Y\4TUح%5eJf雰=?;sop>@}B|]&|8B%YIA` ǝ5Qj ɇ8VhUgE]ս1Xkqp>4~ֆ:<dP8'%Vt v慖Ӑ^JtZ%:m! ,-EA讀 NMZ2o$$%40XLW"Hf')MÜzv l LeD&K$eh=ƤGBU1Y"9͸6TjAJ@*C$ItY^/C q#hNF<qh:|G.6&ôe=pX:!.8r{[ktDv26O@R o\[p Jʇ@9(o,wo|URLs>҈_gG{'5ީi;U c*lHbs GӫI.=sTAP>=M7upef~םM&ή6}9gjE lom+CH~lVO. -`YHpa̰EPBQ+JR@ \e#IO=-R"Plco2gQgW/87a۾uoھ1o㱶oxL9Z۷8l ⵃŵ}CYKDtھEPV1pGeR"tP˹t !:"$I8 Tؼ[zvnB.m-y#EH@$QѢL^DlO< N*J*"OR熚SPSjZ /?T ]d<~Zv*63!5IɖPsz yc윚z AB;/vLG/+吼S&mz`I=f6)ޖ ܖRH Biwv݉d-nI1!x h#&9 O-Rബh;DR1jB _aG5r`F6A>yD[޵Fncbe'@d~i t&`l&moqUw3U.[\RQTCEsyp.%"6ȲV*=ECl5j-%Q<Չs[dcKww `&ԶYINzSp\q({`ci%/{ v5\+{O 'eO ƪ)*;0S5(Ri]6ּ!-@&] °`Bu>%Du`? 
Vom;9ӈU;ڤ\|| s0$`_ KEMceD~t}P;NZLF0`x |ED1?>{P SOxzEMgx!6Lԁ[ʢ EYgJ:U$#Ôf ԂgEfazsO!/]Fo퓆Yy/aK3ϯ2* /j~Zu~nM[k󑗖'kV=_oo֡gg'?]pu8O=Y7OB%DaʾNq\= ar: ¶֔FֶŸ@lz}mgmZ6>iWs^ݳX׉qdb\>-4RXydp*mB$r:\u8zQS&/\\yʋ_ci3ǦMlI{z`pZca,&F¶ǘgƘd r/Z4:5r58q/9FT'\4ϸSMU0/{<ۧli0ͲR~}V`71e9aS紸~n}7s=g6SO OȾ:MzԇuQnO}Ḑࠥ! 4 .a8,'G2E|*P` ǾpL;*+[@8hSV0GG䄁a Ĵd0=> K_ O.Y=Z!c]gH~Z>'=+$^ 5+TF< @zϧI{}+wX'29kD<=5x5$}s)eW"t?=8x'TeTpE\"SR/lt;)_L>[ E$ ñ祹-ɋ=ubS`r 7<4_~X\} ˑKEGMNx&$&1O2p'1Tɡ4IN U W#ΡOox5fU!AFQc]>omE]Qߛ#rf{Hf\111j9$Vh+aCdf2J/!g{LItQ+"Uh7ߺRyz|?gW#{ϡ+ťB#-#ҕ (}TrN$Bom@OLZ5>G}YsoL0 7v:!IR;ӉQP >_yfM# کGѺ_}k8ao3RX6|?; a,ic[O8ϫ_XKzmqЗf60^pU~iyG / ;N1/+i ck.](|@%³M Rd8tmMCFc8I l̫)p֊bPȆh86:b'1xVH~t*xLr$L!ߞ Vh`V:GdJ12BЌ#8i\}06<8~s>FFc1r"Tw\@z1ԀFY3MGȋ@{0j^p<=xAX3oI:v}@Xx&ڨYI$y7gslD 4ʱ5B(Q'{"ǎ`2UqyUh=ug6c@d9׈a\澮ޚ'$׎`nkGg4֕ x[kM{:DJCϷ; oYe;>0OsΠv4/(tܒ NsQ1GDMf_[| z/e>k|2i2?/(H)>;3֝x B0*~$0 Z{T˓hZߝ69tz݋ݽNj6绵V42X0`%QHVSH+&J]ЏN78f1 ?cJ13B^:-0jgE`$~@@0 +u !qpQ#oI`\H]CR[J8 #ŕZq,A+! Qx J!(FM3 iDZ& ӘƲdRBDrR2Db{Nq+&C> A8(A%h`P *W4#s 8G8K q} (z@gF{%I}(%id %){A1`뉁)X-`FS@x<调9iZXA0B<,Q>Fm,>/=>Xd _ߟkܷz7=}o>N61G_}_[8K6nlj_e>vͪ$8;g;LvV/8YC4/AʿX>jqۤ^WG&I][oH+^v} z`4ݶ6R.88TMvBQqLdbWf=[^䁗1fJQEp*7p|ۺ%{;}@S$#) &8 ='7:S B zn{m-搒t8nϽX;Ӫsusϫkc:{Gkgub9λؑt慤e<גf ,G8vpa&r]BdžڍF]n״49\\X\5=T\ S*.&CŤ~6ˁbjrzSSkm>0\Lf FPq1!Nq1!X*.&,.&ńDD҃Ť2qJ$! hO3)$h1)*sm5:b\JG;bi.-.,.`(t[\L;X\L%?X\L;S3t}T-W&ocq#epx -\!qьIhgs!)bĵVVY_/FEqQr.fNJg V)y#d)V2OA<=.]}nVXpngdWM&jO8Q-gFX m3 1=&g8r/İf)XW~.uvb60\؈p pyY^7wlHȚS}j#7+J)@l*]/) < (eԹ9õ/Ubf`@2k5|L''v0r[ԞW=ojt4gf6 o6W(HYLB6Sjm]ňiuT:zghK7;;OI EJ4LKNOcWٻARL33M r${)ٿ_Vȅ7Teͩɱͨ^pڜjhl=wT1rُ޼y֣'_d1 "TMβe:Qf=1IA":$lmɟ.Ol՝mLseֆzR|߽u1>Ny˕% +F6-mNR srTK5PC QS Y!;B*ՒHZ"HBMxanyhro8a2 %%'oa &]ag;YdqEy}9udJINr~SLH! 
y%S_yME~hȁhNq ubyDubXǻuW&]Zn [r*SL|aw n Si)M[ r*S|ow#nWaLEʳZ_ޏs/o%h89&uvcFp.UnS,*KILYXQNY2'@=uo p|p>cNmvS΂?bJus3{n/ :s/!+W)N"ÖiSf=MR.`޺ykƬr ʔ~42Ve+dE!0VBwqbX hp>;x ?!08ܮG#>9}%.*^\69bLfӫsp|8+y>h N4w(͜9X(wCT860;^} )@zF>hTXA>ȍr1*15 n3SaYxc?3f1t0_f_W~dD͙(/NmBuR{"m&`~)'JS4I(t&Vq,Ե L"ӄxP-[^h 0<P[ yT)dHql, *ڇ<=F!ƛx e'^ ΅F<'! YWRx+d!B`rr rΈuΩ +C3%+f1qMg,m9[Z1*ŎShɭ54N:l<kpRdBL!wq.$^{(A".0;/Q.,GAHle!|*S15LJWg!}?̗CE%"ӫ*b>1Ƴ_I,a$6[^]2[wN'_I>+s7;ub]raRώU:"(y3Z\w<•I-%*4UcYsmU4d8b޻;8H&bcpn t`EaQ4>0? ѳѐLKN(ʝ ʄY~ؙ Z@3HNHG'֌ٻARL3NJ'i,rRPmN0\?E0ٿ_Yˆ#!QUYv䊦2Pa᾿* OꆓG/ RĝupV6 S0!+HnfkB),y`밐'V8#Xg rM9l()G™%xU9|8^܍؏0pK;U7JzcNdINQ팜6WOz< z7_47ӓAD^g&߽$~8+K#~˓$|!gdoRPL|s1h B ߐwNO)9D%,틙۫͟cτ襅+jQ35"+ֻwzqCx xaJcD 93¢he2_0@ gY֠lj@eϻ2 E¤>פ8NjۙB 2k^@:p |13U>d|L''ܶ.GeIy)bvyq'/NJ̰y$ӌFĊYiF{kXCWDCT3"*(h'4 bDL*Gʀ vV/tN.+*{n㚎#xr̼uDμp$-@KÕ@H90%nW %BU8y4gX9ƬPuy?k0PPB^$V/GK |: NJ2lY.)e8sBr\[ [!BYƽeqcI;W+>MW77)1V;eNѭAsFgt4S_u312bLa<ŵsް,ՍX]x(2Bq g<9qTC*E!9 4o'9z\`=7yW~:Ͽ$#8Hprbd/c-ry 9xSz ̄@d%NH0D(ޫc-ڱ@YC'5)Y>ԙu! mTHZ5hnK&QeCrWH8, 4tٙzZȚ=iT/i=p&ܨQԄEjEyS. Ch¤fbsD+<1BWژ6ɄJz[ʕ4'VYǣcr E]sQiFC,Zh akk<ms!2f#M 6a\8/iu`(%Fd҃|T]bhI\;8lǏEt.oDڀ|@#{@ӣCLFObZ! L6K. Gǫz 'SaE:&5hfP8h_! s+O{?ykirNwisR;SH9e(謹^?e.(ϋYљ`Z3ZmJ+߁Pwd5gA>16|qi˺z '/â׬Lݒй祤_a0E >*We=vxJShj;NL$ gnItVd`i\Ko'  0_V|φV[@)U!Dԅ  A*'8;ۧd9Sf5pA,8]ZaqQRCIHF kbqGBȜ>lwJO':lh CvсhQ-N $OC. 
ƖmOd9ޑ}qX~}_i'<d **%F>ꠉB+ &[%9 g%a B~am]ֵp#eUŽ4\/OߪD_DWN0V,P*>,ZԓTa Pz5z1oMsi ͘rn a#C퇿_f@Wb]l(ښk+fj׹å=s}5E\cj+A9{dˑծ4#BC=pkcSQ;R7 0 }>3c=X'X9jkd(*lfup FBP+ iL,E}'8uMrnԃLmd,ذ=3By#6bWks6R}!JF^R<@R`⳼Ӳ@:7!p!3kƜ 򴃼p]mCzz)?^LI(6T)GCxQ1D D L f @fwjs.k={jlp0W6L(@Z0֭D 0fn)}DCjZK_UߣƸԮL#{r&67yd6DץuNJLŜprbz6PI4D=fD}BS{_=ޗ)Le /wݫEgaBF$ seJ"ERJ)"\xëqyWN^Od&Q_Tp~j͜9wlv lHFpyԼms0'= jVjQjSP:X)#*ǥuD0+DFE&wo]R'~+v{6rF^d0%eFFΕa,C/g{%1Dْ2ܶ!݉} ɾ6'spkׅpr73(C7dX=õ_4񱋆񧴨?iMd^ nJޟB2Xش&:TFA Z082儗)'L9e=''$FemI+lAvm$LB zK-%ޣe2v}Pġw2朳b݇96>bd 2]eT&?Ä~Y\2 e X'ვ{iM)*$81*Ca>b}hY3hΒF=p֭ T)nb4REEK- E] F4DAP@JVДN3nF?wCx/'1hh͋ G4<0<&E|g0A@ǒ5d-)M}K.ER͑薔7\2g5ARPÁ ~9 /ztr&(e>&Ezg9JCV] 3oDdSfFshRC qj2)MUˠҚjF@ʩA pTAA+ " -EMk*&Y]ԙ"q A){gPqARGpE_j$Ōr/[ə6c<ь494%bL>&fst444ZaLHPPeHz-A5Z͙BF@tPNVi#LNEJ mқv*y۰D &Sw*F~7!& xl$:zCVF09(6;j3|F BUmV|D¤YeF@ݲRDSlicT,{ i0lF0Mo L>yfmӚaKa9>7ڋDevVoa0Eƺ+ԠRufy[өÿOn- ĝuY6w>)%J#dtzl+zR4Fafˇz25'/H_X_bbvwO\$o[4w=چ^XKC&_KIkxǜ?YgO S,4?\L^["DA 1 4E 1r_LW~$oaH<"s64;XL,<Tt I\+ F( N=Vrͧ+ׁlN+xb4"7 /C<ܭzusSa !*@C0wdg?B\) u\- Ϙ߇ތ87z6*qtsuI]vvd_gÇ9PJ=+fBo4 }=y1/3`?q$3w%!kWAS!7Z#Rv|_m-ڛ+Si X,( hZJl VEʒsJ\gz҇ ev&4?\+i:NszT Hψe:JZRX)eHeJRzŘ?#I_1 }0Vb  Fu=@h]n `+ˬ<1Dz2l9kzlz<5̊'~'mPJTh5d 3Bd=- -c\ϻ̟͚&FLPY0DQaX0@gb&2fm3=X {`]kS9/ҏFIi D!Or7)e[81wY;WY$A1A~^MZHmc?QUj%eg߅DHݿ5 QHU"jtbBtZMѦ*1j`+fX¬ N8i” D0k5H.iHVHqmײ/ }l(P9B=JPU{$$v!L9*-҆2 A),VRjպY'b\`b4aQ*!qPL sP\s5y%hG!DDzFEiC,Zz+3No%SZRqJ%RD 9;%-pVvYO`5k?V`c*aPXc)7$P+ 05A`3WFRy㥑2CY`SuGL(S4Vr`}{ #@wJ+Jf(a%E;v0ecl41qڸ^?gt!pS.b75Kx^g3z0EBwU*WOm7Vy ' so4u/}Y*EŖ1R)hU !_p?ps&I-qo`g4LzwSD[;4imHohgMB{G{SZ#޻bJ_f h y3f~ 㣟փm- +DGt1}*i~*l/Us~5w [,O:qqqjao|0hWφX^qQ3\Z@3\rz !p+({k娗s7beI5ңEz uP Heqix%?e, ZIBm1ohѳ\O)i͙рf8f%(T IyIIliC)"H'SIvAU_J!K@xöQRzZ+ n6D/( 4i055[XϿbW1N\Ll/;N%um4pYl'潴l btᄅmOܸU_QS/w?:{<bV[/]8kos5|=Ͼ*if]ݜ,{cRkOxc{Ft!_STmK{ T駷kDJXP. 
2}(31+Vb-a ,sj>(/$[[u[`(NAusGA(#J0o$t$[vg@r8Vjc%q*i-+)LIIWC36&Ǿ%NP)$[|Q .,zڭ:TM7q1LUZ7pnm~PJ*>yő`ra5KibkCJE#{ڸK(,Or)z|j'fPɳ֚$=-v'!(фB¤Y@GrPd"LFWA-$J$jw:iܭ~8Qp)#zFh:G IyL>wOBF }]ɂ҃z߽Hc3CEHm,uoޛg]#U%F~ejm;o`Bc%("ci6 4 HD\qB  !wxL/} _FOww]alΒ.;+Yr,6, U\N@FKԪld@2t |B #H"jd, $#9p4!kRMc0T:K,n /lj.%R}1H],j4%|B EC/}5[Ln9 P1d>.ѕz̹*c\[*"45LR-;lI1bX_FiLbEkZUV[֤=;hn[Q8ևP>O;be L'cBX~&h_Hx7Ss;{Y%tҁywWEfwA:g DO6{B=8,jQ]ddd%8ivH wuO`|>:̀R0'9??U:dRƻbˋf^RcF|]7{x`ca*@}Һ8a,sitEϊw?\Pk6n#m)gی̓"iS>\YiQP`吪҂RZ[R8r"bϡ@}L7 y]vkY. $CkԬETPJW(ھdR)tE<{ڗt({~egu߳zBt,Bח4WS2[d>Eb7z-Q#"`^8eÇ٧u_K%il )=<ŘnZ,bBQ?rHw&Gr`h~ަ-kwڧ+>}RYW`}E)/Zqcl](wX#SFoFe@Sw{'2x9 Qvv'I0񹉏PF(ȝv6Pr->QLNlwȊ2%#> W]MF+?+BГVxPL"nQ ѷ/FQq늣+4IUz3r6݁yzX oA]5]5t_`"8ǹ/Ԩ -`GMIrN*1" {mL yF\ j6\/<j4/[)'uyaw~­` "}=RRPn]V_G[}muM[Tod4EFZ.)h Z J٠pH)r3|S޹ew>] ʔ\= Z{,2)Tsu n&JjE'%[[[HDXhCk_b=cEW OCo,iaJ?=d1a>x$5o`ӓ~Nѣbχw4Nin ҽAnaẛT;X˄&Sb= noc=ahfV}oBiԼuû QCwT$ h'\{ g yAvUہH,޻} ,al_}nyJ6[ U |/؛BxL馑qiU %йnnK~}˂v)G6*X=RV iPE{J*r2Ζ{n,S(L|HyqiP+k1qD3[*6 ps2-H_OjŐ}2x,'el,,U WpΕ  `oY&`9 gi κwb_65&cX u |s3aYĤ40 @%fZ1,f xZ o˖_'a8Kdj3J,^"Zqfs}ms,sQ('2rC 8y15A`=)n%-h+HKYA,aTA)96 )`$80j2`ܔ,z Cԍ}Ŗqa@ɭ؁/Oz.s,q(0 ǷOyP jiz›бy)wCBjԜG_oE,\*՜5R&74^#yK~:-̃Coћ_'V2Yl'\D#-<0ec[D+|Mmb CEWngWfu5[T r2O{kjYt2 Qֻ4 Wkͺ [.);Fv5Lp-Tօ|*S )mźiק`r1HQ1X㝼AQޫohuBCp)DTZkhH, u07Q?>&Ӈwi(F[50\\~2˻r>e?)r>JT|n9ULf[}KqRٴqU%UHEHn,Ծ]Fzu>Le, "iLcdo%!!&6EsZP>P`A#]c.Ny?H[/(LH;Bw$zf<79+X;l,N{RgD {%zgvJNέ;h`cj:{M%+B0`Ѣk^x,-񜂴FQp5Ù V|҅g$#F}?0HBQs}1,); ZO$x9x3E'S7.p$$)0ž҆RR8+J2[ D ^FuNINgKIbJPA`.`D ц)x ܖ@fWJ9pK fA -Xk5X#,ikip'.ps9iqpj.vMF'ح$JJP̀ P6iXwPηb@z=Bh}f=le]s'DZL`!ڻŢO77J-n\N0EufcWPm ߸ǧ6ľ]%VEW˃T}2x "Y1Mcu`<YzEV`#L K4熲Ҹ쎵IyöXk\z9ΊxM!841BplH ViiSFNy>ǻD8$ jN\JцY,3z< UPgz_)d ـv}Yx``]宣GUm=`JJ]ULe- %%3E0.dYa@'3ґ7]k^?|0ks YȽJ'Ndt@ "n&UanRo4a1aq>ʃEǂ: ˁ'}D`":c61Vz Iͼ rFS Z*^ V$ ,&idl!3D`\$wusW=Jɠ>XT:L% *>NmHWZ n5k>`⭨!x(;B1ut jf K)κelg/-^voWKr W- }Ne݀%Yg7_~|^ݸ/{I6>81.}uQ:vJ{ҹv]#8\}ݠQ1%vwX:Py^)Et;U:vaG U{}F704\~ܟjm5c/XB)ǩ;jIE#kTJe) lRUv\U0¾D ([DHغF& -+OQn2+Z\źVsh;amzIlagE[;2K0e)la 
WRWȒVSh_}׈Ҍs-Iу`d\XVw7߼).q!S33T40#iUZ|Gd];HbvFN>iQn~lL=r=33)mF5Y+6M팃 m|yK^~OC'eOG Đ͟䘓:a50mּuk`;2: VL}䰒I~pc@j/'+#s-Ԍ챠nubvQs^N4Ik?XZin]LMG:Jf2Uq1%"+^.Qr!2 XraGK< \cΐcKb`er+ 3x7`l:^Nxӎ͡N;10'O‚ich$.؃?+Z_ιTA;ui-Ȯ&VCg:[sCMX7SlQ:"0dRs-m̪ UHuL7NVj]ғS>j=0Hj֨ Ibf#(O[!G%gL++QP[([%AiN[ k5_[ rIƅO {b2Q`,ku#Yۯ4QZ٩-v)YL&ՁTPo;҅`T)8匽h'ժ{ns?\#&xWw.]C4|д+s]]׉/:w3/⍜ox#mod1B]N71&m3@˴K) iI|M:t8)3̠,T!r޸,@ #i U\o.J|ܽ7 9{ZuU}J^[c8_\;/oBd߄d_-Ӊ{z?[Pƀܵ߉w/]3w$|NDolF1%i3ꊴo[>i1~^?~VA<a5qk#`̹1~%[TL)Fђ'FzL5`Db]Z'k Yϲ#TKGDyE;B mL  ;xitu[uH\/%3A2Y똜#sS, U٫9)iy%9 utnOoe2Y%2ǹ;i'U:9dd89;ZG2X Gಢ : 1bP`P9Gm(8*2G1XȜLqm_4?BrbSdSeR)PB dj[Gv4}O1YdQ+! !0IꐕM~v$QcwNO/,Z7݅ۏiYL׷,ja K$Bf/SIgJ֦h`"B8)mA=eҭ{{8i?4Ix ɾQ#Z7,2RntdY8Z2)A1tbzËs9QG7L`HFǸ S0K.Xs Y>Y˷SB)hiE8%gK=f#ʆZd8pzz[m(HEhKmIQ,;L[Ƹfk/avaR86 {w #ƔF}51#MZ%f`83{%>\$wu@0^0/Ane `j9)羇2j V('~T&TŀSPxjF:rJr q׽&~rna2?x;fMLM N(GihJxRJaq 4Ͼ+J _gyv{MTwl#7O/.}#!M}­0rt›2!V!ZII,Bf!/ȋ~uЕ L` Rۛ(A$9NT4rCBٰ$cOc:7]ֹCj&E. GHSjF.t^𕂯WiXULԓ,Ekz{ͪ )FC$ohr=؈U ٓY[BFX6"mT*֥q-vXl[ܫ*JO^rFEmqLbx b "!G#Ir Xn#eIubD>n?Ñm jv$}Y9br 8a2AUaZYNjtv*2"`JLTdSIq}dB<lIKrL_;42ݟߍhr*LMgO.~Șx|OCouN"Ov-?b4/b7oWHaH!8fy4:D;g˖7_}O<ڏGڧ]ƉOً7+58q2$SOHm-{Nal<{=(у-~bG*2+˻^A0ܾa|iiXF粲({QrDA hɆ2Y+&x!pq#cDR; Ȅ*Y!ӡ"&LLd̃wm~Ywd 9ɾ`&ICdՒlR}|IKi.,V>b[y7d kaHBPsMI^c/D  Ԁz5N;Lx],pt\E);FXZZ=J{"7O;>@(avx<1)<L ~<Y΀T<D~G>wDϕn[g*`»Oс|5hp50qht\`zr<^'jvZn82Ha̍ 5?]|f  υ 'iFL˲^Ʋd\t=Ie#ƛL /C'=Me#EȖ|C=9Yhx2\Zp^BaYd ֵ .bịYAw̖߇UwDžMbu!@5 e=5QYcl2 /3(Il4b[̮l0I+w4Ib0(if,o+Js|XiO)9وrUђr=Z(b\S|MOM" CiZE]:ântXw98FvV;#R'?J0G*D5+Tך0f]&J|i2i@vQУER',_, kK`!+HFmn3]HY,P!WpG扚ӭ@ iZ|0eO.eob'<ײ`_&ڪh%S]>;&C:QC-oU~7ڏdiU|@h%j#OOPqrM#uLUdO hޯi:fø:t:*jvNhד L`$3 /QTȌp#Nxo=#O1 81tkjO oZJV1ٻ)|bRD FY8猝Oc sI:jSB餕Jj] @E̵EA93 c4XԡR"8o҄(-XДtnQByfns*Ɵ3>mtj06lqm=y/ƞH;: fwۻ廋zry{9CK)Pb,`inOeISo?~FJI¬ony; kUՊ/&3aV< ǓC5v}p_7ti2y˗ > Eg4g(r!!.VaqwUq{{/n_%{_Px8ƻrnˢy4T%0*UҥAmOL}7 yZeJŜt}fR$"s #%R'+Cbt!$SPT ?dj9 DuΥvaMLGn Kc0ʉBĶX$FSf 9 w^ $IB%r 9iK y! 
+ϭ ;6wYdR0.:k&\VxXD ̕.5P1Q0A䠌&;\,8`j)9(•;MO';7abzq< 9D ^' WrE2<3%)TAM8|bc%I yRqY( `XW %s+#=eެZlؓm=Fdٹj{*Yz^M"8'W'Ʉ *xA`Tz{6N&xyzhaUa\xZH{/3u 3M ٗ_6twlg"8oghA}3֤ډfWGM|w}=r>-84enU?7b~l\IK)/\P9w{`0\g#IH8R$7Q.c<`>*Op{WD܎b1n&7:FFQ/?z1RaB" j>Ў^ڀ.peq`9ުm!K4(} d {?|rXKoy7|WA?~^ f>e"m:-B[:gy?YInABp˼᥅i8#KO?=?RD|geiM\v w91W/rU}mWhPWpdhbC0ŵAC|iVFhoe)0)b;Me#Fe0Nr5Iڇs>dנX. &PW7b@HHǘ4߾#DKV墴j;/x˥tS+r;1׺UK$eJ *."gЈS#Ş_yny+!#DaȭF\@ˎp"BH`ȫG!x:DY+K ֛+ W_}D0&a@ל%oQ+8RJ'Q9azCqDqJe8$ !7/ ab *VKmBsA#t%+- -l@;V wZ|mhegr(s p)!AC{_i xi,Py"ccvD2ﲊRN4^@(e/ iHa*wttHq<)^&?;~H?8 ۻx&_V>-n]_pn4Z-I qwXR.JbWd/$3m%QZ v*4Fюi[ bAC̟[#cEg+Wvr ZSыJ$aRX}J=@=3~̇$!CR3xWFq'_g S<:C}-A;oQsNit# lu0"+}jucdž>PMY`eۇ v)&L=ߟ>_O?a; Z˙컉ӡ\/9pC"S0!ߗ3,ɔ)Tho/B諞&|zx0yh=v]x$Ǜ y ڠm嶺?$`BP=]RBZa:mIp\ɛഏ&@E+)ڳQIe>=*uM4xZI(ucKlBUYO>{w 0㣫7ľՉ}/oeDutd>fQ㚆K2.cd/[Pc"ī=.H~Lݫ/s$J7% _@ "B/,]dVsc?W0 WfY+>Btv>]Zig_&KݼoX Zn/և 5xCc]oͪ?+kz7l}Ydfv7j>5A趩c~#clJ.-ʯ7XM 2ԋ,;+7F1k 4y0~[]mXiO-+\!\ET\]F;֔1tv:+ȴҙֵuk iАWMtzѹz֭ lzJ߭!6Ɔc$n'"@Yn-h+WuJI9\R6%G\ӇSl˘ *-㏔2(ބG=jo&ᏥJ٪qkX ˇ&b买TՍ`gַ|ҋ<< {DRI, D=3L 0ӓzf#)T=q-OTQOm6:) T;DӟͿU%C#`JOk88BW@[QjE" UڸaEI3ބ"5mw4=|ir)7;c qԛAKw;Pz6&R* x'J%4PvAIkk#KB/;ٲrf /x iat(%(Ӕ($S ȝ HlG]w\΁ si/ IqMDQ{Xc{<4Ϋylf\䦗)j?Jt@RDfEIX{dm:o&-^`EpNoC}4*K1̿^o"6 O[-v͹zbQ)}pɛΫ!)d~{"vJ+]pi敹jyc85rU^Ypsz^YԹEA&G B!({ Q! nq,7f94Lb~!dTWQY9mdZYy0w0 O~e2^Ew ڡf;1}@ 1:Y}MMrgVV5:X)MtRuUpG,.1'fd[z ۼK'fCpށfb`1hh 8%kli7[x=$b~Yݧ\2ԓҠKAqԃޯ@qW7o[AGrk%Z/Χ4 ňZw)؈\&F5:졷h 2*шfOql2DwzqX q\j’p4~,/3H%_;5Cf;\lm.&Tg?3;G_.?!F;Gm: 4buD{aYda,Sgfr̾;X`F SE?N),GZdT|g[9Zq~=ӪPrf bE?x1sn& A:s6]:;Qʘ!BǸV劣}ƙyR}~+j7ٙmq[z{H*]6&s'9ȧ|wbrLgN#L2uWVFQ g=#Ρ)!7kl  6jo4=O3$szAsZPaPyxZ8i}@~-=<m`e=q°\'l%Ah%z0cTP.9PaXcKp/WAqwCq4;18@*r 9=\&mgdT][hAxtA0d4Ԉ:V\rczgy}rH?.gUp{4ڐS $ %܉8^!V?_{Ի5:Ckw=$sZs[LzNS"OһfTRw󴡦qn~6{d1LpØΞ6͜LvZEZ4< " !PC *d[כu6p4$T+ 0 ſ&珓(wYLɟ—;p_\|wwS^g`5v_O@z{.lQq2\sZ. ƴE3u +ӹyCݬ>rhK -Znݼz'|u6 XnB5妨!0u=,7&2'P0Poe4<6zrzJC`Yλ,݁%t)l%E;bRs{;;֥DXP1Fh(ȝ^7_cki {tct\n.;Aܩ*M~s:RaxL Ɏ5( aP[^Hpޞ=H0$MUk<657m {}baZ_X:FxȕwOvlPP\6~Ux ?x,oVp5@4FyGpsOK9JA{J${=v9|*[Kr=dk! 
#mӌe;Sc!oTV@\~ciAo#zOIĀ6I@ctnOc -GOW`qnDn+{*܄A` ,w&ZEWn(*-(oe|j{:qnY*b)B$˺[i-^bɥaUB혧͘#iϖ *"D@[ A"Uxek<$Ymf_>X ZS[Xa^aA>+|,d/| }2H=-NDv`Fɼp|}'ywτP2C1G:{ѺR)i5pOw5͔@0?c#_N.y'a`B68I2o8#V_*Ou|L|/0<'w^9eNį ?Da:`@SE7KкQ&#ᰙn[b;; j۬ڂq0ݖڢcH##)elYAӯC΁bBP)sUۼ\M8hY#eK[ zwXTҳm] 6 :_9@K;)Ck--vZNEK;Ghim[ .vm~߮б}дT7:vKk/O갠ǭ؍9\9 J=5ew L8C f1J8ĸ@򖺔mpDā3Jv:̔el}~ ꡆnGta0T::.Z2q:zElSns?PÀ Dƥc*<@CCWcC/\1-h.H[=D&F&{-N`յYIT+ڈyvkWj 0i=7jsOK{,&;*qx詻~rvރLh)Řiܒ_a$7띸lN6EI׏e@'ٌg[g>>/Kq%e1LeO*>1ѫHS "4JIuؖ*ƃ0TبXNq2jOSh~pnLlj&f ]uZ|Y֓qf5P kN֧l}:sK1S9|XJ i1lǧz@k=.n=8*$˳r-D7'!}=\V}Ԉzz.ysZ+yEeH)z@% FG2hp祐?Om[.S6QkL21!n&]CtsWU*% E;"-<|=kh- ~8q$q"iq5~ UyV(Qq!Xf iEWJ(+]E䠏9>nkߏ(P0[)- ck\gw\sD+BAW<ʡ)g!E*‹9a+ܺ$b(]{s۶*9c&x?4w뤞&sxu*K:)ɔ1)J\$Ebb7.݀51i-<,ᚑ 2M+Ҡ3<U8X D<* wo XI qjXa \xbsoM!DFL #h7-z>vL"xn7/8!\!4k "qGYacژu!N?|]{|%d~C/I;EH-X8QܐX0k JЦ㍜έ:r-+C݀D%$.%q,KfIqɬT]d#Ui?x]-+b5~qYF8![$T*: \d}/NDlk…,_"!ճY֠B]` %Vn;y'Zf @L55{qbxFN@2/@LZ^Qb5̄X[5hКڍ&8R%A(UxR`d-5 wHpU"aIB&m*0՚%V ^Hi~89ww[+&ņ C@jO jf \(x&뼶6S͂O.RZ[+u_;Ҙ#߹_"QZ2&nD6y5[De*֛g.eJS5.3-w|jUn+Ms:"`ֆD&ܩ)L2 Ì4Kc0pk-8q@A7~ۿ`Nw9IOyFswTds48A}v?uw t_3wos}wpqCpmgӻgWd_{W{YS^mGc/x3mߜԜ}hu<gzus=6{eۋd3_NY &4x>|Zv>NH@DTί;d9:1lwog?񧽽'>}{xsp8nO?\tGz}ܛwWnxn]]peu:kx=3ogl0whTt}֏ݾ]cWgkӒ)%X6L,NiمSZe^Vy: t߻l.fv{&oWkG6^o_z ]^(p蝼s[r lH@U-W5㯲Tɨ]r7yg::N>lUNF>UW -Cw ӎeo)摓2>pKIw`MoaܙJm7@).gW6wr(f8R'z#}ݻk.:Kzl{kx\[w8{;UNpV/ -b&M$)@0`Dg r"NQ(!WhT 3NFkbd68%@)68Ál߽ h) d% 1Ucvu7[`Px9ZNPc˼q 1K9[fZc7`/ g l`0؀EpQLD,aDqtLcB3?ʀ @ E@`7ltC6ay#?od7&Y<7ny;bĝ/k6ܟ K5|c)ώU \gl(P=7/ {_O958*]S:Վs X|RfQ]smV+ÝV„f4(Kݸ/ɥJ%7.uR7.uR7.\ER)?$.L??z33|]f06>@F4֬Są*{Mέ$Oj=Wx}U9-$~q5x‘bw\J.ZRZB*- S@J% hS_2МctճTN(aps aYu ǐ*i[l 2ަB11K2[ y*CԧY[90WIV n$\&6o qYdX,Q2R5(4ZkgQ,9".heZsFB09QX(<%Njl]E!Q5L*=B0\[=`VU x8`vPC g ,J>Km/g2-a,cpȽ-Rd<<  eRc3 Mvn*{E ȏ0,{!DlBVbSxSO~>&h%g47Hhe)I $WC) TSԁ,؀̻QiN,Q*F" @N*8P^y8B&Ap4j~XNFAx|zRԙh L +S2ZzPi00;a Y %Ft%HyN+6&bc+6&bcRXʺ֬qX*ػH |2 % 3+P( 1ʂ8q#) PHJVXHU@JVwğ"~guikRtk 1jjRshZ+EX+EbQRfZ `JV7bժ.oZ,Pz)J1yTwNjszP](.re|Fc%5Nγvrn)XfV\Aj6fsyLJ2^bg)\gzźѬԋ+֭3%[k$Sk{쵍X~F9n_+ +"m/vn 
ZYN?up<~wNOwezj廒Ui)aT^h*DPfɀ$MUVKhw٣$տ{[H҉} ѥv"O_Z`Z ֩ϻ55wVkzRsLn| !ej\3[m v:a (^z8_`.TO琰5߳/rzT<=܍p2^wj[N)R*-KY=A8_R3ׅYK-7cw-5rJ{/PR"s&n][?=zztZag/wz<(S?|zd Bd~-\BoC+l0SU١+>eG}9V*N.٫oCW[O Z,䅛hMi#l-Iwo3;Tμ[zEHքpM)^9IĮwST[*1GTWf[eHքpM)Fn*=zT bL':ޭ=N+'m{mޭ y&zMqBIț`:3:NcN|>k;[gKu^Zfvtv(1tE08I(9dQ*8V;kLj )Q4\U<(܏z,U?#9/ =e z qF]oH _|(%:b רJ99_cmj%~/=Y\x_me[Ňt= o@Ӭq .ixldBiӦ(lAj܎_|K%Ѳi}c {ЖTkqDRmsD:cdݭs Q\3>z>do] IQ,>tĨ.hXL/߷ҼrG4~X1jQϛOFk(5N_8t ß(F0?Q:-~+3]2Նjj4x k5p@gh)-|{; Uk(n-tq[~߾2ɓ;?l)&1XahpEWpXLk %-@<ύ+2tDf9vmZ!9f=J5m`M b dM.êT­wIVj*{0Ëؾ6wƯ>ǀ!'̀8"zfCvhg{gEWqQ8#,?~jGHD&ĚU뼺tǜsfΙ?ݹ)ʼn5z9ƕauRoB~R}WXBۈWj([bi߮IC2z4 Pd斐`t;%qC⬮bZ0%}z|o;:wSjmjL _Vl ,䅛hMq1xݪ=zT bL':ޭyVW8"[Mlq_QQ`aԞ#l<ڒJcz=[it%5^9׼z}-ݜ*$aݬ ʹpZ4:ܓ߅o_p !PٖHP(V͍9]0OW y$N]ʿ F`w)`DTye?=٨WDUF:v;UMlۡzKۉO9`Ю{ mR!JGj6$3tEPpǡ.FGxc BUpAǔ:k@UZ9ό}B-%hf\,3 {D[v[5(7r P}7Үg {CTw/tfi=dYu:>\vWa4["1*Aj5Nz+9-:m-L|/o3tFkW%!B" Y`%4oנl.mkfT)@թc A&L_TzEhM0Р\Y@^0$\FA( 9WBtDTf`{(^f+Ds;yri<n&NF.Ad7T֎^;o!ad7/  `*;//ƴˈhJŒy؈LLۭcc=eu&LEUqƆ: crr25$aЄ9p:!.wIXu"iouZMыb,]/kE%;9~fPùsfG) I/ 8P3#ԗa%pP9Z>y ҕkg░kMr/AIØŘ,^XFUT$/u,BB2Tn9˪~AX Nb3^^<E8،;~Xd$<~rF꒰A\'3*ͥr:d`h^9egx'dRk%%. +4 ;Q;2glVh[;1s[dkZVxkdRƢٺռTNp>䎛X}zB.ɮ4POvl? bZo߼I؏{c3;P}-nz2nU)/&<>.2 RC^iU]lRL\:꿒+75#Lr0ߊI"R1,!9TkR(Edysb* %7L y,ˈ

rQcr~Ĺ৅P} .Od4G#:|w`OVЛ_UM|3Oq@aw.Κ;jTL?>~JO;v:aQ?; d>< >jD0k4K\y7qO8і8!.+Leǧ ThXDS BNw5z>.*Q\P Nn{PiO_F5R:FK<=@z4l F  T)lH9UQBI2%%m)W,ppPׅ)';| % 9=rRNz 9]A@ a#V !Ck$qi|TÇ,zP)v9P5ηU*/,O(Wk5G1Fǝe 0&zaวQ.(U:? TGj:Q~?eiv˭^.4Ws*d\ lv4~9 %-o|]EbD̏2ǹS9ZPvVZ>9-N%zHQ|3R%&sVPkVhtlkslG!:h}+f jBU~%XN2 ة4o)I <#mYɌZHG9NChD:IpGChM2܇/mOۮ&~9Nz&{19_~!j\&:Ǥh naob&{{Xir+]Zp ŵqhuа _<On| kh 딏/L(1dȥeCUQՉoYY(*ɏ-51z\E=|omǘVDkTP#9k[&us-;>ջ6'FCf~\g[-!ХtP[u{PPZ+oiʛ<#NI:M5&h=JspTnB GV9Li+ڄCiww0 Zh:v3g>))3!jڨ|}sFv|[/=!q-$zɠ}Kp1x9y?7djH­hۑR*Vl]܄s]L ſbD@8ZC`5e>|Ɠq*{qׂh2weqUdinįU>ǐ!'Lx {fI("ҔD[ϕv nVA ֻ 5-gzɑ_2`}P`;홇݁A2ɲe-X_2%K#-2JWpH%_hk44z9vOlqg׍#&GǮ|G:QǴqaJ1fvYn'/18FxN{( g@qm5N>a~i;wJBԵ?EsObk4Դ#dTs<~0H)Ƥ3 7G2 BNY֛M@[WLxwL={ѝ7?D$pd{˴3XwK! *[q&YX%$9B5$X6 O{cP7i5/r7( RA+ܻX?#Ci y|]Wmf!n'{Re˔)m ;CB{{o)w⇠?>'r:* 5VTtN6I 5e;5 ( U0xИa^ЗJbLJ:Kj\T/oӇi7P/%}|.>Wu:退Ǩ\Zԓ| L mQux,=ULOz#uQ+K&0˻SDR:!{̚O3ɇp_zB&9_>qd8jO*yHjg{$h?TNsW|*6NCEJ{v75ДR³ r( \=V]L!f1jJ!?,%NM !X4Dž_ g3ufSDsu4Y{=~QaH=}M;H*x=ሜ -qCE%t!!@R wJ @V-P@#X GLZF񽮎orr!],hX5ox]%z!c0,_pg6(bC6@\/ZNfb{iu ݝ|.RrH{N4h/|E3H-v fxrg/%ܕ QYx3P RTB\yy6mkv̫팫W$t8EF/Ȑ2"?9Z!"( ) po\۲xP!h' ~<q"MQP!;|:\wBogP; 4} %KyO~2mۘh26V(X _ |;C u{2̿,YQz>!NGyx-1$4!EDo9 oje6 :Έ{tFNBGa|:ߥÐ3oQYxMBLs`|@CH)gh?9Bku '@jD1Q.d0 $}P2|aUfZS,Y9gh@6ǰUVLፁzn} 8#G孄"j/McM%O|󴰓i𕷅7 D;j߫w~nde'OT[oq!IapVi1w2лʤZ5W f3^XG`AVVhuae G"u%|a˧F3WC@8Ri ]d<qTP3P@dz3?7'U$ _RؒRHΠ\HS"K`T 7܀-hd˵}%[`ț#)nc>Us=_ SJDz8Fla}r_{nhcԐs1rV`H @YbS qGMvب{z)ޫ+_P+\Q]!Q ,4dN}dOV!#SaC*-cr4mIF8F+lHL  H=m{.n͹_D#`!kR_¦ I FB@92hK-Bc(C $Ci{PjAIlgz$n2A˳|QMCZ˃νҹzx{0Mwm2Tb.4;yϟ$|v7#'lQ'B;Lfb]ܭ/81p X ) Ŝi2Gm;Lw772(8i-ǖiXw[CUI9Ƣ];cշPQNp9ׂn54vZ0@w?u!X͇MIH%%^J.J!JCOCPe^ 9 Q{%V#$) 3g/;rtEe ǰ1m༔RXit `uK &T;Khu <&\&2Lvxx4`Ћ~_**tiD.+G*?~>|0ɏd&]馡7"/4smG[]=~i>^UWx?}o>0O^PWO 緤ZzͰS[T\TʠbZWCL# TR0n?" 
gJx]s1"YFi@ɖ5;H0%A\0K_yVF*WroڛQ00-@ue+)F%W%ԯz\!jfh`ORF˖oJoү^:$--(ix:,A(B hjO_H(@ӸJ..NG鶖-C݆/+|_ /톌3;χd\N??xO(H[ڦRÉBg^%..K̋K;{NUG2+$%{rYhoJ}C z.E$\nf~3KL=^k'\b':~> v{ jH9@K1B,_<ƭ?HȞ>rF1<{ɕTzB!!c BR/#LZkTҘ,c#p^+⾌@CV*up/1lj=C.>U}h :i(F 1l8[n !:vSwSQOE.擵qOcFBlŎiZKw1WoFƘ!ܙ o> @߷1'hC)<܁[aq|[s\c ":?Mfrpl~hTS滶9 f |Aܷcw `Y}'@p8T-TrJ^vyEd\ʐgH'Aaij'!+[} hs+ 삩:\QWIZ.S .dB/xbQr' hbocfm׀Jt$S B܎B ~]j5,KA,YI#)eDWvSGKF,G(5 .ۅ ti"b|+ @3@Ca%B E22g,IA TRfõ,40MQ!^X9駩UdRIaQeX-k`èZ*%i}Cq;XKGUν9VK=>[=+ A,o '@&Ѥ`L@RrIXnXtkN:7dݿ#5νTI@k􅢌f^!|tYi~;_!O׋9*EGFvlk)$ܯSBIOě)'C@c|g@sT&}xT6x5 a 95rF4QƋj]#\fV>yzw'8Z6Vwnlo6M߄8ffvK&`y1oyΉ`rl o|D fNOhԾ= kǛ E`)Xa3;YE$⤏T@4>A`HEY\ L!.xqnJe«~5f Ƥ.e5v\ )!Ƽ=~LKBwiQfP*DebWjT΍Ms`ٯ(C=^ŲǭƼmKz~[vmz$‚ǟn?ƣWi1r5}'PqJp-&bV!uOb I Y\M Q CP.k'Ơ$C!3+8J_©+&{U/v gqŐ6 /A4TIZd!32 HY8jN9"b*УЖ$ +vcJIc @KZ- V#!nA޵6n$"BVW0OY ]KA8cǞHr.Ք/EM ,YYF=섰㧒zJtF`EI8[c#!Hnc>Xi4~fQT%u1r|4O\"*36R\0#%y+臘H\??CV`N:ϥd?`*5g3ae%'U;A@l4}m{`6'V*M) t*ny!Y*QEV4KcU=2{vU>RkeNݸK-W!W\?tugٹ/۫gf}s᤯j/ßrw;Q"^g ^|Nt_&``GYmo.㻛ˋ[>1oo.r.[d변m 2_>#)r i[O)ǖAA3M0)&%Mos'sZG$h!@ 8g pcRd[MeƍSm'RkzD>,fE~_K6LYhMYZ,ޝ-"GH}7շu~L@e\_o{cY3.v?/Vm<^e+,\Y Ɵ u} NX"bԊ|L 1b~M펩o)cZ5T\0MMY=bûf_#zN{tnGH [-nCX 7m  3LT1~C,)xn֠Tr{IZSStM߇!W2k-P=-Sȏ~Q"S_dwX'L23ͽ;xoʅEƄ㑚.[О0T8͑F? 9Yx{GEo/}{ Nꀔ&Ίlx8}tZ*¿tVMϧ]  )dio-BYk@{)Hؿσo$zeK=!M>׹gAN^ 1|u4Au{t-Vl}XH`dcsΡTs j`pl_b7%e.>UB$4I2P)fc*s{*1hcN%26;\8 +`Iㄝ,TT!تJ@_ %%Y@m|RA2n2lZKЊ)de50*7cGkg{~@xy{@;4mBԻ-#GcGx4+$$$/P'a&\' DBoIhĎuc۠`i`;6cVuBo)Y))ž5ͿRcD%0ŐisH4vșVm҆@#30cBYͮ %#lɰ5 s=HbYw/asV٩ =H !rG\-gqB:: kL{R&`M0 0NS {jUhlPHO0 (&쫏O˛պfhNެ/'f?}Z,߾yC+,_o&\?ߐsɇl!y~4,v{'z],?@gv΢6څ<7ffIps .B "515Fga w<#! a{7iQJ+ 73s'e׊º}=fGpoM>+p׀ZDC#9m( ыt~gb9MUo%2ܕVHF)=nUѴu{FwPs/aa)HX]S3S5B 4`'/&2e> GAPh`).d|Ye>_ɊȎ(,< a-*<)  m0'bUKSPg>`>]=mWͮ([~q!yhfu^~/)#ۀoKױQpmzCx.ge:JauIH2OzͯbMlj*TNR|sϓN 5Vô⃪֑. 
ZIQ--w c^8k8a I<᱓ezZ9wQyWf4ftW"ߛڠz^s .i1#Xt+6cs'D[5YA4>A UNEypL~2,lhON>!h}.-6Ƥm{o^?ʓKgwI%c42.hVrFNNeSSťb FQ'K,AM OI}>{o# zA&L/Rtz7xU;+ uȒTxM;Ҍ c  FG'6ޚ'> 6yoH #ŠY=o N ?yUw y&æ$ߒb1Fnv{M$u`g?Twqv,:CAp4+NgI_@G EX?,Tb@1;ZAjb[,-`XyJjQJp2'YFOYK册e< QK4؉+jx%;AIO4)dx-)T8^j}Xލ޽([7;AюMKtHچ6SĶI64˰Rʹ!'돣8ƸSG%0oDŽ!Z}oߞr|9|ʦM{s Ķ<\qKSvsKSt6["ƹbNB5f4ujኃkЖaL1:v#]Q7gJpc<,ƞ{ij[w75]:\(8A[9\}.P$rB!'5ƅgƕ"xF 6*҃kEq(9Z(@{c2רO/̞<1?T>hHƱukpe"w7#٬\m'4lUn*J;. &Sq$s*%T1Y!:Df!8OtD *`OϾE >=a˼VHTE{1%).quOe9Pqv<m7B q=i/M5<GS>K ^:i~P'ETAX*hQGqbCi$Ę<(Wkgل3l`ȼv_DL 67N}zHJ;.w)Fh VW^Z(+#[\I,=63=vk59A,`8s2prmf7i ʐ4"yV[4tr+⹊!GY-5r~8Hm+%~knh >u ,@2 (4,sKB[&bd7-,^r2M+hRF>/ݓfXjXx_S  O'ξom|F׌*^ pca: B9pL׺J=4ȯwh>~5 Ѓ;c+/w, q_w[ЖlSNhevXKFƺ_=uL? l詑雟f:ڭ6B)ϊ5m1޿¹lV,"sT3wCNtӇtw]"zgC&x%lP T!%Rz#}93{~j߲_,k NPS37lשZ>}}.bmhYYHKDieh)~nQ1'P &F:E!h%@ UBEԎL#C iٖ& LJ+BL9{.)fX%YhsNiaBυbyݲtM N)с(CMZIPT;-X1CU+.$䐟-ڊ1*kDF0G,r5DQh Z4,Qn؟,DK R/K u:W;(NJ2Rn)+&-^\Cg'inh 0f'6-;@C;.2(=j-m#r-}#L-}#?]nFA/u` /Um%Hrw2)RMIX$i&AnQZuٵk^t#njYK(+.R7+4_<3337FQͲO':YЍطު(9츑YvY@R*Ztja0'3⶧ˊ[`$*"0Q3-L0;9|qP㡛J*`|rTAgqf7XPSI8,>5iGZ ~jD$xl{hDZ_RDe, ZBJVt BH3^2hBPvEytU14pvo 4?jy@>Su5y I8~Bc#lVx?PF裇tҔgG cek9VW^.A8NZ.v }]k偎-p;ӊB w-`-}h{yu״!wxpЎp2ݝY{]VHW QO|g[@cGo٘QQ|^Do hjjՁ?f9c- gubKΣƖXho7V|ŭG 0WQ Mn'1 AoH}B{A!D #߾yY~|A%7vz8#[Luq~pWL"$Cvk88VjG$m>-ߗȦemtO)[#,Q6/}y~B$?KFY)^tMp3ɩ ύK6^Am; RWs"?B| Gk zm.j0Wt7O2?cx4.T>rEaԼd]6ܢ}v @Ӗi^7Kn1NIk \AcNN<, .KE:8bVә5K_ e`щ,QU4Є  Dc\{\rA1I Heī$mծ@ߣKx'ZXm `ɉ6qZjms@X<&"*D*p33!ez̫Xs'Q!"ܑx/ޟ!vs9꫋ >׻⢤/ KkH蹂YaQIma.^Ջ,Yjۙ*InE__A=QzmeZC#_˔Ӧ_cPcky=RSAj{vfyT~{uGCn)a:(vpb50vn*bmrRT(y/eD2,NLR&Q3ӆO+ܿù-DQ+fZQy8e=D+6(iҊ"= .u @ IBv=9hQ]T?fe _b~O MR톝H@g;:N )`ied(`h٤p4vdہ.~hD+C_` kUC= m[IU %hlE kQLu(6F) Wٖ9ƺ *X|wFNJBV8W&MPT]_t +ed1 L!c @S]2N)z}ʛ9D2e}͠S6)Xpɔz6|)kycVy.a{AmZU_0>A>Ƙ `n%2xlAȱo.D !;qec1Sp, :.3wWpUO N;^L4Hc4Nl7TP I0HsZ`'UplSEjA$mJ6Rmєd܀i< T۲թ6kmˀږQ#um Ǚٶa&t<},F*\o0j]z¥^ ҹy^a5{l GC: %Ô O1+` 9Oϓr-ޟ_6V$>D[pơ(墎^r 9){)aJ %/* dA.J*aړgcF|g` ٱ W8(FK@PK DԩS˖x{d2J4I#DZz&RZҮu RT)}B~Js+oEdZqaVCp@=72uB'#*~c%Kc+)c8`mO,OIDJ"$ "1Q,J,cNSy 
tGTiqPbf Nɔ-Mr I5ϸpg qKC8ͧ1#/wt*-瘉^:zۧ8Oh@LʩA di;m÷f>Jbb7*i?W\ya'̳01BQxi0Hu:4a@_= cH8)pe(S)0 L ACPS eL lmR@ɷ񏷯F@ TDeL8i"NfX m^l_ X*' >&6 }|,x+;9gqkIB4kٖSh᫊GrSՒIr|g1ON3=8G!hN%fUCc8`;E&&Hď _>Xc+9橷׿b睾_ cofzmZ3^?rM5 cz׳ezg^DPM'Vlkj`ZQUg@z.GsiaNkK"e$IeJ+cOJK-Oln:bnla S$"*kɐ:%"HK "TEHL R%iFZROg*,>oa1/jq M%"Tz+>MzC)e* zRlmar7NLjᔁRΦ f<+#JJKhQEHIq"@a.d5]$ `YZV4]喕8|TpDƘ.sF8.⏲%>oMjqFw/nZ R$#;n<00@hU]ǨNqP,- uNKB!gvOc/xFrz J^1CrEn]9}rV<(H* ҁkN[`U!cM,/`,Fv ̡y`bԄ9J {u砦Љ 31]taRnK1E9C6m%G?u% ʤUBsyא/S;EwS*F#׶Δwa6G7N~=Ӷ} eo1it1|w|M~cyu:c ǂ7!N8By'^5%BNa~E} ȟA`us}2Ӫ.K{ X!@0ɽf8߮u`$&?(+B|M֡Q6}G6s!՗3`.;cjO]H==&). O#J,Wj~{⥱i5{\_83>Hsx3y-k;\˛ RIHU$JF"a2,N 1TƙC3˗gW  1Gf2 Jկ; &DsMJ}K}߀Y1P_mz5JfOesm&wir41 er%k%p3(j"5 TC"`,KUٿP`dbR2/ }ۧ͌23>:oxg[⎸ˁ;rh\v|"Wa6/8eim6oNɴHQ:u`y[LO\}y]ǹz1f~sճiOlopa@ɣߣ?.xk?qaƍmz3G|o3Ƿs)7}~fn>Q w{;{\sl+ÛAB !2oq=U7eP)*lN~bv[袱YߋV %y,MC-gR2=wW/,78U,wy@]Tq}3vaV $YWi,ڟܠ0=.~V,`%_Vōt@H(fAx01T R (VY!> g2xN IBq7?gyQ7܄,m0cIa^lRk-aGn{{k 86;P "\J6{5Z_i{5*ओgb)<Ѩh ;Nkc1 j= T`^!wSS e_wsF6|&+a:l]ve)NRAw)UNp3͗ʙͧ3ir&é31:CTC*3}]jMO퇸 ڤ_LYbCpPrŭ &&X%fL~v.t] kxr[C6.'nƇEX~-Wɴ]y죏&mj>W 򄣵0(8x=B rf ̩z;4A1b¥sP(9RƏ^p?+c&-ʹ[t&x0҂CL0f)V+N v WnpC.7+pFm>֍N|uW^`!3c̋:m,عKD+УJ+vܝ+TCv 5E{B\Nפ4TGbsGb}TՏwǏIP^ JʫFe0^.!bq);rd':eOUJql{?#jo*,_pޣn请HzޢzWAM O?6,WdWO??Zh%j'٢(XSRmX`@DF 42F2mu02D7y鈎a[X.٧kmCVX OP/ /olTb&lļR|]R:SjD>eR^Vł\!z1Vzi=8r EB/LXoUeN(4 dJ"sWx=(ө k ߁Y,ޤ)PSr"æTƭPZc- pid-k}(_ x)PwCVԇƦжI<HG߯FQi> m(rjZGأ2H&g(vFKǥLB#sŢv? l!Vwjngu@B:;nہi+JiA͡8Je`z%T=X{Bm0`4:l Yԍ"<L1Yw68f\13`9LJX'[ b[K|鵀R@ᚚ9`"rYq0rG/'556WHR!t7ԑ?! ֧='^C׸- ȶR &3dBJ !pEG(L& ML)$?Vվgau7mlVOM7($Րe:zq[fKv\|)V&Z;s 7w/CD0 + WRR0auqi1USR_?Oߒb^_\%At/l5 ˎO/q$]iNtT Hо=ig!AHܪ;-Wvz<}庮ƉhHJ$Kp& 0<(ŚjAJX$eF]0l(YE"K"\%G px<(K.;k:Iq玆ZV /P;RRZs[ɤQ)BeɂJ0 kt ;K( V:"60U\en!U\Fv|D)!9Hn;g ՘J^Үo{|ZS^ ϦV7J+4X OV:7Kog{om |-o҇O>/s|[nd9C\iT{%eӃz\Z& RhLiy? ö{3)Řpz&=s]1E5I8Rݿp% #"If9U 0S#M9X8,v$_)6D*1j X!! 
vJ/D*UlOOVd@^<# $S ǥn з}[T!8,LtP '28i&)GXh1(52Iͤ.%GTBX$4 #\: ,O` k%tOMyT'c"sdwϓrMefnDcy dzuo{_OVnє`9ⷑ5men!:y Ha jȗѫ[<愃}XR0.% +k#M{0s44~ NJ r.\RՄ 48ʴҖm$j:XU^(s8=^>Aw/CivOU>} a5hRV<$j*Ra-PTH:b0%N%\9T'BX5DR=P1VݣC꽐<R1&m^ D4x[Zs*:'%lAɦǦ "9H(B9 6}LyD`'@dޣz>ToM/i$=Ѣ&w}!so0b oNW'[d䆒'pYُi&axtΚNM\$`;dթPd,Ϫ~xVl<^|4[lٙ\? 6W\yq->n'x}6q)vړE0 3D÷6zLp1\hc:!Td8ƀ\9-\ۖ+'}81`#FG92B=3 [Y8 U1zb ˾2eg1+9OK^5prc~aeoʑTp_מE77ԩ+Z}o]Iڒ|EYvvX-W$翪+XSƻ,y}JŪ J<ܵ+8ݕmx @%M&a1T},y"zDY}KVgɭyKϙt5b<`|b}))l*- k8R$8\H 5bhg{1l'*JO^3kHm}Gw?+o5fփ|Gj**SudC3OCX[EcTD(aE>՗I)Yʷ9XԬMfYXvsGZtXFz? )j>1\rwAUS[=YAĶ^ٻHnWzٙ*藵51vx [jI#h$SѰ VeOH aq~_ rsN/,p^~J{0 (W7e=iW*ǿGZ[7uAԾĺMwL5ͺwɴn}hW*2=]9=V˃թ}Gu!u+SshV=\ևqҩ̔jaR w^vN&u2ؤ=7'׷7?zI (LOO~gg)?[{cMl/gvスN4g3E*7CS{MV= V}%Z><]VT" d]HduDtjj)"ZFH  #!-\;/-# ߦ:s} MTƅ#0&8RLh-wN2V) 15ċ @/YpoId+|gFEFYBE.KjDA5<ߥqsЌ'W&BY)qq.][f}AW O7◙8?t4N'IrNr[2:+СsN7x_ipvFK?he٧`/R' ws9byfBt=]:/u55L6.O]ݎ~CBɱx=M{YhEj"^:44 lfr/<*ٔybnd)]I"*&on`szmb@q] zwKj=` xSsPKnUf+Zl;0U a :6Y#z v։(^Y׊aT'*]rd ;^$]&KD{^XRݪujERy7aŲf ټ۫L!!n:1} ^QΆn7DK%1`6DN}HW@ju?u9j❥7i0D`42Ҁ)G'ZBhxRTSsJY`p6/سxUкZn_i;%p'2B>M#V2t[qpV5J\ῆ(n ^zFF3Teh  Zï`(q*x%̈́$ Ϭ"C/w5VssK!sɥuFwdh͘V{I!BZuLk\N 5Q4q5l6*a6P|[`4J. 76P6Pk(#Ȭ͎HB@٨t^(|a4;s.i$$.֬ ԣz,ٻReez,yJKcxEsDӞ+MUff GrF:b]ZMxmT*oZW>e UT ^ݚdׯF.dRV&Q#[mƓ m uuJjbLA2GEV E81ٵaEE !Йik}YVNS̈́0uwݩB23LӫA,y֔_TiaOYM*Qj2d:f߫j!UUJq۵n[-rT%m:oe*Z֭>-Ӻ!o\EtJ8wVɃEۄ`oCf5'VLr_A^7d+hDXP`Rt{sЊ1PlBz6-Æ}d X:΁O X%x:F[sΈy?3N P3 MfY,p ʅjv[Zkn2s^ץV Υ#˨D"50λlFVp?DsB|AH!~CiIG_ǯۥ_3F+C`F4|LqY:9:t''?j7S ]\^&#YÕ l" p$Vb/\Crl5HBeq) ɹu}&=8y&?Ctrw}cr2ri?Ki~sk>%}.7#?D|!wzX쑳+r҈gWr?N|[k %u]S.VJW~X) At'V.TTqTu,jd*x7h]Fˍd">98T4! ) &x[7Χ|j`l Z s˜jYudJ̦N q)c[匶(C5 kʰ1e,e;HkٹmE&w}1=W̟ '5A;/.]Ʈ`TӝP9k<ӸY ""ԚG,2b)IL9Jװ^/bn5{&Ǐy;͟ p)]˫˝kA¡$Z"XeR_\Y2f 3`:+&WJ&rd2RqKy>ߦo~~켪&XV5BqZTZ2#!s qPըF'L.@H>nq 4mRF6ՙG sn9֩=T.n?7{5%"V\J n{[[*jJ.$`-Tsj)kjݦs%^@g{F#aof̤4$P~!ֵ9oz/T Ϩu,k6za/@a5 }ձuVM:|<ߢ K2l"w{U*X:ۿi;[OvvL&kvIA))ۈRŴjaQ() <1O/q͔STl;D#h x!u\4!(#^{ ! 
PYy j0'_5[Fl1KM0(&)ǤC {hjk)bbG-?ZԤjnn䡞"BH@dAhkLD F8"] w4hYlxV^PS)cQ57dASJ0VH2ZKGǽ1dжK\Tŧ(sF6fugѕ6%*i=߯@]ZQh nۇߟBjɿ _KW$#Y[#?'?t/nk|,H*B$7O>9p2OO.kC<1^kE$#8'kfpuJ$Ia',fއr~h"6.e) bn7xNTs^3y%Aaxd:B΅BTh; pijn{*WRj6e9wh>s4 f+5m(\Ǖa'= ֙F>Z\ 3KkRĉ.6 cYZIhŴ\W!9#bbڮziĺh6oZf >T4 egBi7:^sw.}^]@ّ,Ǎ.m_Lt}t窬\!j z4a9BqӺ:y:5\}TC+a <i&cZ;S-+koS10u^#X h?L4T T}m<H 9ߡZ&l]˃(i=ᖂ2m-ál;mА75:e:*X7@uAԾĺMzL=n٢֭ y*ZSktnBV˃թ}Gu2b4V_RiА7m ĺ%f'" 5(֙K+n1wb_3` T(hUHAp7H+T" JoHD3HOaHZ{"yʒJzs3vRo5Ԧ dtaZM$QW >F3 *7(%⛔cp?{WX eX=ujm932+ #)j> B zPf+,+AO|X>W喧 ^--kPO&~NO_p2q6J]!*..a=rqI42!@*X]sk*Q.i@,K1e I3<a}I[`Mi=B@ߺ[#X\ \H-;aI I{"vZsi +LVKK-m+ hV;r1ѱв wBXT 6Rޒ"}PY\\ +"bUќk7߂|/'2x6(AK_g>AC(< Q'1]{?GDu$jpP%dE _UU,"W<I/ԑ{T,߯Se_멐ڧdFn3q2tuwG?taRRO7E'!'LE~vYC}(6m)'Ϩx~lG w~^N~^m쩯5)tYFL0MaNɠ++o?ηP5}KfRV+%:+F:R(8Z7mʶY9 JQ}NhߴƲI3$^%-< A2^JJR&{¸hhAxUh1eQJqȋNWB5R^f9VHQ9OpĈ:eI'r&(&4URݩ6bkf'ΓϣDiH.+PGH^ynC DM!-$k4Ͼ;Fn;1*?JQ3$A`/wLZFFa!aNtڣzAk>軍uspFnX6sjz5)S?oYMwJF>zدחϗvRY #7rу mu!݂v΢?yb|s_)}&_9{IfM/]nyR~CvNr5&$jw ˯M:p,G[e] uRr_ޒʶR^m41vJX4{:RqQ7c]{M7 M. Ȅ8»O6KcH#7Zleӯ :WL&zez ryT\*S2NrBѭ|t;TceMVJDa0|! @2b{PdG_ w,2E c.!$gLĦ-;o3=X-hghΜnŎ6cBSzom3!9lvO\b݅،p|}-4>/6=܊p$c[hvn8?fsA]6Ќͯ}h8)ehw>Y'4 .ͦ]K&W1rU4Ŕ" 5 !Py#'~40}Hi;HSJ9iG!H7, *)q '&CdH}oa ,>6 S"c`Z`( %K0S # gΥ,.Q`>{Ry|C0:o&V:22N|C0eؗyV ttGkg5)eM7]7)WuskuT+X>n~[qSC!osQ}NvkXG+֬T5d ]mVa{ Яi_ejКIm"+v$ZL u gф}znaף%o;t1I<_h4Qr~·,L.v1a"%#21vrZwEwRBH15HZ'cK{p",t!&U DRB=&S7%'i^pTJV1b6[!JeI$! 
R/[G ׬R Oiv1<"lɢMdol𒉶PcTFK#%/9 %ۗLZFذdi8S͹npFvc.gQ[;t+ zHQBCs9Eb?@BlX>Ѷnp+ Ē!ibMDeg jˣzS3ᇯZK.D;@JX> \'bռ5Z?QŃj+F^p@kYdY 0/7},vO_db'dfe'eaeDCFH ›PH%U8*L $(1U^̚sUݏߟ7*!Z3bNZac(F*fE_/JP*{CA_EI'V+S~ 1aǸ9C,N3j:vS;2TTeC}':tu,AkY(I7 zȿ-{7nj[L!\ݑ si*?__]$n/RmOO>N+nYk),$Q_W rďuė|?`^lVZBխz}z*Nnɰl+Q4\^嶋xQ5` +[CmZO6b[rZHQ싛5HwӢ$;v<gP#jM`ulR5Me߫F{V0կ֩6VJuRV fKꌍ>E+ծJu>DuQ!#lRu8rhUvvm0j\}Ɠԩ83TեQϻ_~?ǐHڒa.t$jꡫUCGm ]>Ro-+l 2y2ڕ  rsV<| [GHկ֩6O;Ơjr-}Og{>dV\mhȰ#\ ;t0U`t x"#8hYa㗪[r?PdM}/Ÿ.]^t d'⮪QFeCؙ erڒ8v\ Y׵$b?>ǥp~_|<;knN?T.aSXQ2.w$|ۿ9C 97]*4xv?$Ǐ^%WSpŔ\,c4ݖg|.W4Wj[^$}5u,cbvnv Y:8 &sش mɒuC[b-%NM032[2+0%"l#.jk1dMXCs5:f5(ץωdBkl77Bn%(=MS;Ft8H hVƊv4b zF*wk>mW}cFhV"j=[$U_[$NQd{ps-O.Q6ejMfV]<-ŻOu[mcFhFa/*1 -Gz&:$9}<)ݹ(gs /z#^\ElxF <0Mz~J~Mmz5]j [|qޞYkRg|Jk;zjo7>RMG(~qӎY] >uJ;KAFdUr0կ֩6z 8?i=F+ s v&Ղ#[[hJ(RN 6O11@J><bB|٭W|@ f9aCq$PJ U8_ Gvj95x 9*L&&>-wW!bz b^.CvbTh];/W.b ̷0ԑ g:zS *NNm3߲{ SC+s]m c.DA̷0Ы: ;֊-&0-+L<(OrYOBzJm)UC V|c, 9{ 8_hI5 lSV Y)’v>#QN8ؒ' ƢUeړ1%TSֻfKDH/kISr P4W}{9!gpQmF5Dj3MAXW 4(Lt*dT"pJT awYw;ZEW('x +27:cJ,CLRN\콶:ڍ4 kQT,4)ѿE4,=zQ6콬K 38s*Cڝj/q&6*vY"iu 4lv zH^g@+$iۋl l΃q`%fq!FJS `ݨz䣆mS/kťQnI.75]D!(g>(Ί9Twv $arz2{-恽&99 l2R]1@eMA7A). ·~N5ǸIǝ8ċp!NȬd1k*X?+GSlc6L9ԫM0X=6ي=O0V5!\]P!:ϒXf:A)kG/ ,Uŋ_dppa$RO;\,=QܤDe[ 'T,Uqvk(OG(˗>E+U:BTO&Wpۧwb✬]cmT. 
F*)4CmJBWhΧnM!lSNC .IաFUѫC{i@r|}d4+]wþBi6zebSb0PYk&Na `+ԭG([Nh,\5}P^q;OыKKn}|˟7^¢Y=|Z=V4F%-F7ZNWZ{7 U2 Өk9x󢌀KH9y )XeFggP$g=mg( >mf2)4 Du~XC3.$;`J2fNӥ6giQXc*Q-YYKɁnxCLTE9LЫs͵𺠊]ZTVCE_ahP^G " פ9qJ'ZhTP+S5FKR @`#TeSEB}4!+@vbɔAL 0 ~׎}a=28- aK0`bk@z(BQFk.B͏*Z2Z$HK/z)&BennXO[Jɉ ۼ4*RjT(Dݩ]{) VSFqtCyW \c$[-ȺڷL&ٔX)Jp zIS0LrH9Mpq̞Ȕ MBk +>>ļ`nKFL03Hc:W$"$i39Iۧw8w`z8# i7 Ql] KkvǃvXn"!:%t;vC.'i7 ZHH9ܰVf I:L*Gt&QX4KA{'Loa9a +[ģ <ĸޖؙQ/tdE7hZ{su'?\iysq $*2U^%`2d:Ul$ĉrkT,_h-@^^<{(ɂ6d $EIH&řo\s)05@;āQzxM~9E-du%4řo<>.i(&(9/u*3z .4| w #evF(I3lfla=4<s>mD <1 Iz d5krSEC0*b'eL PF/`݀UnT[RȺ(լNUsBZ'a4F§ µ;+44yʌ0ypB)3H\n ya_\ ެF"Vi㌜em:0O*ʸUvG7 zc#Pl9]VlGŽ^~].fy,ڶM hnbc7M€wǢrk>7lf}0[z6[7ԫۇBh]SkjUJP9 a*1h*(QO"p_x~qsxZ=lv/oC7q w k޲}b>vaqk{`~ȩ-HL(/"yjTbJq6)K3Ӕj-+fY-%%gG"IWѾ3Fh3h!Sc'ÅscPП!yJkd qc %5`T- Yhe*hɸT(QcwMc+yu~4Ʒ1O6=Ri?Ef!~S@7Rm/10+87J%+iE+?rT!oƲr{ t(zМa#I * mSJa1 #pux:G`?q +.8f7"V~zisO/[ 7*U(ٵp`jTCKBן ̔k|42O $2S5aJ i33~;dguP SխL9RIn |]9;敩PXT-4!jWtQbbiWf41|a+կj)Ip(~K\g°LWEFu>…Z)Tb7Y6=RQ(lB-7&-.BNORvTkq= 05K@@-I % IJ3Nqѫ|,| ^19^y+FrF,/]ĝ;@#D=Kي~Xĝ4OԎ$HNrhU$\손NuJ#d)9IQŋ1x]9; 玡i|K\jF &}6~wWnOusxŇ$I2S V=$Ԙ1>j}T3~rw(5-Fmv)>r9zp_x >:D1!L$}N0 OFD`Ir++ %cr60@i4_y'uYT~s@0E*p^`)(y),R+*AG}}Es2g-'416/Ut3Mi~ӁF~x19 g/ѩO^la8r~Z@ZFQc (c`{c8k/Q |s,jɌoQx؇lv(6fWP#NP<= zXs-@kڔqKMFX4Тϴ 0oWnuU29vMX 'q4R958jU=G&9WW{E=Ǫ ,V5gn_V`F:9M{d(A9ٓ^ =6Hf=&"U:Z r7&ĹR 32=攉 gj :BA~v3@H%'AlΦf65T&3sZԵ9yQlifLo@ܤs_C?(*0*MdQe9buҪh̹4.Um~y~~ obd~  Z>=]]4߇)>LaJoOC!8lB0LYRc+,hA m~ruiMnq_fnz4׳{}?t%<._Y _oXp~]X< u#:;ؽ}zؔDbPd^b Kƥ^oi -pTJڿ5U-C4Xi(kGed殝U}+,9jdZqu+q ^Io YjXjN7 ^/Rz}`лv ^wmHJY`b|CfȒO=/WlRK )6[-G 0حVy*XS@c _bPׅ,RmWfOHnϭ{#P*Wޯ=l`Jpb Pad檒Km9Fj kJHK5Kք&5aLkɥ0&G, ʒ!NH"0%A0ԥwȫ{n:nczY­fFoO3Ptκl5mtQ'j RHe &hlPn܁_^`_7McߵNP2jA:O6$,]$*ڝ3."-/`EJ79:pIE]"nզNrAYneu}ٻ+;Vwy:I ŭrQ#K*Dm60e;7)7 inh͂q#~qx[}b zFO_3;PSoO aT+B:ƨoo}~u};h/@Q/넆 "<4vW)use9 w[{ >= K*AD}/?:i`t>t`jҴN0Fu=ǔ{| ϮggCYT;cXrN']@<(~N;C,1;^8H/ZO3D.a^ON۶Jnr2w3syHم@#D$.f^BdO '>GC0ZWBtqҧn$E@#$q"ZέHQLݸ(U+tI;}0fRRtx<:]CcNt:{')2}=뽜4}'o="qoE%PUQQiUK~ٺWE p>ٺlKbTQvI:ƩL)3>1|8o 
q[hIjӗZuAѪg@gl~Qx'q]x$܈<؇E1W7 ?[PluA2$ o2,lkb+X.h Z4KdXfrSFf 㓫K>Wn~Nzܹ77nqfq4Ea+_=}-if˱y9QyRWД_o9"G?φV[R3ښZdȎN"cgĈʱ No-4;;'d+:_>GBxijx!{e^}IL +}mHWVdKAn|--ļ=2COq-%Z+d‹A}oSI^_kzfkͲTϏ{^\NԌ9եI:sW%%suVZ! .^=TՏ$S~u?1{]َ7?ծ+Z"\Ø麺N8Iܢ^4ǗDjc7DPG`!(T40L6}i6jH\S][T{(27rC2=H "8`NF8KCGc(+E7P\KЈWJ - xwR%jt "2d+}>g㹝. }=̩d_#lOY3co+}֓ g2m35F۞fl8zi4w󃝃[p??b";%h~^1Fym>Q5*WqyLO~>)~{1].~WVou ~CԬMY/ܷ%}F5+lłrUWy;߼v4&ܴAM r|xiv='}3 xJKɐQD n:l-K^gV5D.nULL㲊þ]b4RH8Qpt5N*A(F)VoaiUQ'RTR]H{RIvhV>Tݿ =nrJ/ #:ƠOӛy *۷F}K!g,'9+hin\|9ގo>o]~zaϙ@SE3WȖMA e'1m*FK1vJX4"bp ~y1=K:=%Ǯtp?."N I/LutbtW,-5։'vsqQ^.N"hM3 m,1%Jޡ9.9YR)USVsd۹08#u RbεeڶsA $(Bm"bY؜3I D ;"Hn%GwUW_P\3%>~7C %=]!` uxտgKSCC#=nF`>"1ubbHZ;#ԯ7hs&J;,n=G3(ymzE P0ㄋM7^Fk5",B 1$Xvݍf8r)k`Ҕu* 'TlY{F viǑ!ՎA[Hn |P>Cr_~)3biD~,^;ggݹs,}m-WA|5oyQ;k9j5s|;d}  1|8fh'36xG&wɤ@-<4lȏ|9%Lmxjum9"ZjՑ.Z96M Ԯ`QԒduRUOJ/$TD]]߮QL.z:Al;ݦ\s%z1:qUL`LJp);`<`jLG_ߝMF2\b**Ό%9/Ę@PeFKaHˬH(}xA'xJals˒Y\[5ƅ. S\H)~l9Qbݚ`"GI(8=*{ؓna܅$B_|h%æ5kIE9w^n "hWKTr {NHz,%Je*twhdk98&j@vZja\`gdcy ݌Vl,5io3$gCBl&~5#R0Q\/mNǺ(,f:O`VlEPέ\XUrȉYT llaĶ$F•ߍ(Jn6Sm?~FwpTDqNV..:2Vp~WlfugƝJCD+b🪱h+\fb㰯mr'ކ(Q&@UDpa8-䂩 3*xHj`;~/ܻmaxgp "hϾoυy9{≢G$z=|uϫ_69!vgSk)ϊVXΏ &0I{]㏣/Hө?Wb`&fPgHm꯫'ÏQm(8k황JAjO̶6moZ樜yN<1EVa4re\HBy!4ɫ=aІ||3.t4azytƔ ,iB械\Kx EE95eT!|ŵ'z@ ÄpR%7LL !+bx.vl,Ur4!䃡{n0c˖E|- @Ke`fh;3)eQe3dRI@[LR9 rfPGaȕ.M7 *#wj K 97VO͵Zn=.uAPpX_Xx::PiX%7\v;d` V4BwK)'XSY>&<ЅjvJ+>̿%/_X*_#Ef',ڋl:ͧF.Hyҭw/^.S {+n܌I|_՞\GGE1W!_ v!lQ^Bw#Nq)[_4t)hxúÝ+_w.^ ;Gwz筵VgEMtA(ܙߧޤ"Xte:ȅNޠt#ݠ{_l$o4ciJQK=!{lT;8u!$]b=QgXN:];68Ᏽ?6͒Ph=&g3>µ T)%mX{^?:n'B͵ iEٻ+Le~[RpaPK"(JkN۔ 5ұ[=Xmr^ 쒫V_  KЌ'Źjo8vl}?:]#oXpJvP `4̟ӹc7o +5^pǧz'ю~wf]w%W'0ads gQlMX˃}Xg3}p􁺽g 6 6NKHs/bȜnRv|&yB@:'lfr͔0F>"Y%gaKl ƹv#2-. -g,S*ƾo#y;#Jo% EwO,*^H x  yx(3NH$G޵ƑbvGE 37bS**1Ɖm{vvߏTK͖v`|)WS%ȏNfxfFRHBJzuuSWQW)>IW xk#;sWԓ?;LbutcO7O۳fM&HUxtu7|ΰ˗}3.eQqV`+KJ0vj,L*vߔtivˤNT/јA*=hWip{tF |)@ ˒.Ry5qHamS! ~N8hƢ%Jklk&'.a/Jpi"(9᢭R8m!̕iH%Ze#Ѐ!Bi#* ^Jf'RI Vo؛X-C؞W@vM"k¼ÓbCbaov{d h Dyim7ݛ? 
@T/wh<֩%Gcppڒ4} H&l}$YIhs}z@;S'c@L8C\ kxl& Kw,iKDÇX"?)'EhMCLdӓkqCqkj,v!9gg`$V1}c[1վ|[m5L#fNDf}i1v+c|²+?|_oaU]a viM _| ru:ΕJ<) x{'-w5TųRK!$ v>UKSM4 fHNxSx7􁟑{R;L>]V<5Z}o/:RG%`gWO/hC%eP7̣B9a} (8q[JF|u'*}[Xʆ , ,> S˔yy+ˊP&IE{nL^Z&\EwI9xU$mY<{ja,GWݜLXPtpWat=6R&=HRh){./ kZiq}SVV#TU" p Q` V5TQuJOckm%'zY@ :t763`{.7EbSKԥLd\=]xIeJ@n&DqI ǔ/FJ*Em R:a-`-DUkaꚽSXyb çBQW1ZX4̒ezQˁ} FmixIFJ:V#yV}/>$GW%$/Uek ` aðmy_"(`6 }~<̒]r;RaJtip^ȈJP]jYWN+\Km[mB "Yc0D,mp.B(L_޿Y]*-!tz˛?4dwv{wy{Q9OC1e~vmAxd{_^]G!$O3ETbƥ!m<,)7%H||n)Q c";Z]sW3ʢ!z/כGOKa[Z8Ia끾mC'NbښK%OnefQI!QIan짧V 92 6]F\Pz v BȰw:uur֗E.CTYSq;\1`-+XU/ e$ߚֲIu.c?/}'F5*M?F -+Z>6X!RQ˩_W_“dB_/bQr Hko 8ϡJMc @2FNDuQ7qx LrM'hb#_zb;(\Jp9Io&%ɔ:pRV$+&RmpjR+>&0 V^Rq ooIvLBc-k$ZEYتPe]z'K'U5p(UQP \)s!@/s*ms{Dul[xcf;vC NqK]oș[)ף u1nh]Ii8Iz8nh- uCY%5bɩ0Dp0lA33xbf߷  ؝(s8C/~vjgk!068\:2ˁWts0)4,MI} Oq^(Ѻ4G?5L/ꏛa& HKp4a{Ӣ*|pH8qk8W1o}?\ʚym&bܮź ջowaGoF1:/.n͎kv_)l^ǭYȤZ_ݧ+5n{)h KsK2d,Ga K,B_~c]k6f=%ōNee +NeÖb"Q|.K.OŞ*Oyޢ/m 5~=c/UmöZ3dmۜ3׊)O>]@ϥsRΡCأ GRgau2К ^#qL؟tCp~CR}"Dfb+5}\Ul^hAnQ*H*];Y/|?![)l -q$hE WjӓsTdUdBi:NT}2ٺi1hz\JԨv*lKx*or @آۑO!O'Vmq>,{NRH$ϧL|e1E EP . LL.lE]<d~Vfwvϥ~F^H wjMkU)\עPIڗQR$A2*Bo UcŞ+[H֒zMyuӪ~W> ϼN~S~V)}9/.uq{q}ӊo w0VmbeM ҥR(*s9 86F)_bppSQEcaK~jf~fA`_zAb6Wb{$`}Zx/UpY5 MDz _V3Fy>LYD*0J%Xvaj+?dJ:K!Sa.H&P76@1R6 g֜aZQNAwKI!lm 1j-ڴS[r[&-BSYlyШ0\Va\pvU.?iYM>'ws<O}_}O8#PONJ oK߉0m*An>֭7 f酨ݨbY=g$|]( 0| I_+gU؇5gnR;]R#:, 4؉w[d.b.Ox)FCӺTNG72NGH7t<ɕTs&_B] R񖼣R=}cˊ,NOqefyT=}$D$@Q>0K!kV1BẔfH:B<g8N(j.*K[|r|G7kl3OZ?=T/@YeJ]gR9o.UV\G􃇯y9#J 0ů! x< ucު,S,{uZKkTמx=ŻKm=aGjn;_\~jjӘs0嗳xTCNQsݱrYk˛?4畿v{wy{Q?{Fv /U b9`2odw{4#^INN8}-jɲV/;@jŪbպ,t^U-)D=t)ۂ?ٲ߲z5b_Wk٦,(U;98w`m@G7.d=ԝdOqφݚ Dtv:(03Cv!!o\Dɔ8_j+G֔!u[^VxL5WbHѳe dY5НϿ u \|<ȑ*;Qh;u۠-"[pJn|}91aE` EE? <\$ /䋄iRHl9{WfZd"Q-p_hZ)D!9^6(+w>`oR߰^S'rr}. r\j|9[T{8,=YEI?ɧ-{[$2%t_!zi,1[?}{ׄp\joOpo6~>4Ūz'2>e~XTīR6#yY_/}'ϕ&&`SX'eBq:W7+ !;XгmċJ#Fa4oX w,l@j˴C|E q_v"h& $N`zC@w3C4gHGC\N8SլZxKRaRJqfj^ J)"+]͚VR4I$&A^a |ȄcK9_'ɬ$Ʉ$CddDI(HXۈƱb^W:-]s(.}_ErDm_5NC\O;Ʌ`o홂aC*Ǿ9Lj]ʊiaXurIrdzK]حhi*8ݩ{Uk̜1qQͿ34UTK h\f H JaFD%/i5SLWj}'! 
(l^;{~Rhn"qo˘jׅjOêQK(!nssw:>Z;Ky2Jվ ꃜϟ$++~w]e-F bSo=<+Le[b$Zcvz4 U-Fd+=ƶEfV؄l)k[MyǙW/M-̝V\b˩e>,_0Gk;v%&eZ6ɓw#[+sNH)/2\~&N< ? aΓW_J;U:`=|1/v`j86T]Y4OC~<@s oHMt v0[tky _ NA }oLMy >ϠUNX畺! ߈ JVI *UFPvU -*9K BMY5.U8*Ҙ u֠Pk1 *xZ h5`c4#*HIGEdZBnStjd_f]܆loAO ///]R 4iUK" V CW-"h(j-A8L;y` tKfc8@TV h*G/rLձUfSUPV\m1'tJiNPZmCTdʐd07 (}aЁjhUƫݲx!@YOYM93"Zj98 J%N3Jxp"L 6EeZH3~z㿮tY2i@C']J`J34'3҂O ?+* :s|*t](.ŗM+WznaA~4{6!G\|5+P,>,ߦwEk a"!<<>\_rAl:=:%#n|uNJ\G"ǵ"^]y|*uhP^ yJIE'Y?>aA(=I 0A9|[S-؜ﲛRشAH)0)-fBQJ/ZJ1/m Cb`;ޒjؐTHv [͋%ygW[.ޅEYQz$|Yn_g &;D/M´kBD&bwh/U-VMU会nSXS5'ZLr,H,uVÓ (@T ӉaD%֟rЛނ'Yun܉ Zn E Z؜W{FEihTe_ $NfQdeq8 f԰[4e)uqaBY&ϵb51R/qrԟ'=iŨLQԘrL +5ЅSa*%[=m7l/rNW7ݵ>-Oo~MϠC[bɈ%ߚR1_-( 'RZڍLW?N1p*8aLU/{y`}n~i|`K닣E Vw qzϏ?t.N?:V# aLJυlJ6s x*#Dfkg2)?w_̢nwޅn}񒁓~=qo^P-Z 2-V)Ij%(-՜њl)"LJ(Sӫ={cTS]0)ԥt7TQJ/ZJ!pV- wj'ԟB*M\ݕ7k_ISTzc<{a3)ЂQ4䒶_kZ<֔$g!U" +QF#c Pf,5y/h깃=x}mA^˖yo!ӭ2_\]"cUP: ˶|o?*lQt*@E/ijra$ 䒜PTt#%n˷d+mk, w; dEnD'Wv\`'ҡD`"MYbB1t[4r@ERӢU̡8@]ƕ F&H[{G[s!D&hص-a8ZqQ҄h+'}&eK^d}1(v:(~V`|V?66=5ۻvJpZEf:1Y!BXꇷZ i4CIwz1M9әxpBn2+3nW`JJ)mT%6,"m{}^-L}ThW.<}>8lc\ǧӉZH`?a7?=}~ٰ5"+?mzW4{x TWt\mq6o}O4xm}u@'Q+ __rM/# לJM~W8ĕqIx1H)(\_}5+17=8Jk[!=-Y?Q=-Q}O_?c iɏ$DaM0U }t/̻gI{6+|CM߇/u&94M7I'FV:(y$!M T]GwuUuWƁM`0\HQ!"RZVJf#px"Щ)D?<\pix!XoZ@S.Йu<g +ij6.L0Lbg돂20Hbzl?L:3/ZS(8ڍE-D͋jXƬΔQ 0,.Y(0:UJ0"a;X-MZ}k,JB: ;f@!ѺD|f} /,! -LQ: `˙qՅQg')Tv440F'MuRRk$=4}T~•>朤=E>mAIx˂(JN ^9ˬj=\ʦo&B]$]cTK·8⒊K"kR ?ײ<kY (F[Չ ˊ4[ǟ^&n|~:L[| .?p'w^yyr]Q,>`84"J`ncl! 
e3r A8LGH+&_=6?f| Iҽ z><1@5AvGP s?fYٽ+t9Iڝ=)Ga*sr_A?6 &آ% KRpU n6G*Wy$ƃ~cQl}6L̎X'#"kҗѦ֫W;koZ<./ jfM&9X i[\|'o9'cGWTiZWbo}.7^gZr(:THbUJ|H#8ơ8XE(yL؄ thAi6FF!"P  G9JB$`rT  & ZSHcZ㡖zU1EisD03!< 8") ;sXđFJ`KBa=HK8䡣b)eb P8f%ИS1"1+|אրWCG&FYYQh$)&NIM $HȆ ODSFz#6%%HوdT$}$GѦ=mh(0ZQ$7sXFz٫B msQ8zCo=@x?:409{+$+QbMj89m~<}*r _lɰ>\pd6',촁K{<=M :.mjIj+ZXQt{0BL=b,wV eJ$ P0e"bbC` vq0҂2|Xi:sqq2LL IgwXI$2M*„:BNV%eSuYg1AH#TEqTU3 6UArRVCrkqϑ  lrau;!?X77vs] ֞_Y(&a&Dӟ#l/)y}<}j܅3] ݸэϙ ]y"Ĭ[abVqFB*r:JS{ս/uܽϵSRwx[/ZΣjel sV=QWhVH p&2 )?'fL\]}뮻$uF3ys^s2v^?XN䢮("4b=*)69ݢ60n|0u*q ɎNz;0mRqeU:)ٶh|WMz kӫ6) h{tS˿Q5%BBw2"bn' 1|;D8@;."6Q5dd[\AW^ɿOW#"o7\m '8$*O\ufNb",\}ӆ:4< QVdE(|2_x6PZ.},dO>LMG3҇03a?=!a>Yei2ћI>Fh ш3C*Z y+Y<LTKݬ>&n;@\߮}~ qd)>f TXX-5JE Ӑ!OHϣX&.!`q=w .Al {GP^ k4&-My]Zñ2Tˎ 7Ͼ0 XHT礰 /bYZ3~10@1ZK/nJt̛ Hk!HME[bt 4 w8G-b\-(V=/O];EөAGU|ɮػ]y(QZdhv͒0BBf&E)X#&w\t9۴\9Z?VˆQ)=e9т9-Y+̷#%ϭE /TX\bJELcwiɒ:( MYȻ0MF37}X{@c ~xUo3<nQ[-&bed!%Ul|zl2Ծ&Ldjޙՠ%f-nX%n'KX0p=Ad_dX-\{ =WV!jI/"wlkֱ86m̔ \p. wr3C,~s|Xp l21q]!ULdbM]]3Fp94i(s髫ko0ҧ$I?p3<t6y_7x,N/G&:/dhm}4-?dhχ`#@rjfN'_ k}*WfdA.poǓEhׯv4oL?o1p#<1S-`y1O⥟$lۛ$e/B~$u?foP\hY͆Lv4|f묕O3q3{{?Exb>dZ` ~u`"Χ)} 0|^2Br3o0&;iy"݀ bBi+moۯM_hyêAI@ܦ_?<^80נ# jY.='?=S>4&sLl /iOŒJ;gԬ)O4ճNą+ve|ʿфzlgx7I6Ibg0MQ?,9{ڮ~._ٽVKGC̲0_m=LL YWŻD-`u4yE9oۉ%I"f)?xn< ˒@w^foUO5Ü_g>x>Yv4{2~mv1-(4}"uo7t:򥜗L&LAA| [>?+Pz7WՏy'׫w.|Ƌoihf4 v@%'.1!?%À ʧq ! 5(YIq¦ɟ/\o.e,iaȗ{ûZe?5jzMK Oy6@rՃepO0uW7SjQ婥ҧ]8{dw gA=0V_eY1Ug0dvE wiWX,"yƱ qܛsE x}cV7)鴣gB.ҞjD=gi(xx8;7( kƅD/s!FEDQ+޵5KjOBut^l%JrZ&/I0,iIj>1CÛ3$uQծE4n4ݍn͘bJc Aڢ4NkI  Ϥ1wݷpiCN^bOʻ&.k>|$("W󏝀*5k _nOOarHprՃ߇A?d:fK[N495_|{Lq>"Jh(HחW/0 xuD:w㠆2XEdVK~w.9Sh.Mo®\5 2!Ϸ1bROs8$nA mx40vgw Bm검! OPȒBe([SJQ{5EebZ>~Sܺ!3gmg"Fp ObԖg<v@ WޤA_j^wXpI{5(y+`$U'Ea/ih׽:'9C:q *:hz1,0հ'SƽuYe5#V+[qe /N1c  UlEuE~ ւQ"$~))ɂ@:űl cͲmG C.6p͓O/tE#t@WSh%-LȢdP*VǯqZ"a[.=F:XRjNq$Ё Bqhħ\UVVV{c-UaqVc9l4W(2'Ĉ#F<ը(Yƥ/K[+gʊL34zO "FԟWKvS 7k7jZr|Vh"Pj02 @5T#]MFh(M\|-pRNr>nFhiPg%,I>7BCJQ|푆0ZilG&RvlGbIu罁i㛛ّO3^]3Kh^wh%VO#fŗ5Hݹɩ"61ToUYͰtCb@U:Gp3澏l@bҷ %{%h7 J wʛx@}vH:x7lV'~7KR==/:SO:/!~v\Z~wd~N0S̀F΀8O0%Jҝ"eEZr! 
VT%fAJ (3qC/U!/9˹vi#%Hw[tFl^{ ޟ DքYJ1y^y$d(bŧ3(cJm7U!s nfk ۊ_op/w$o"9%DRX@4TJ(p<|_9UJV NO`]fZR@zyu,F9My[,X6C6ҐmpwCg=|n2? tY^'37P[9{f2=*rTkAa+/W?w.ՊHGLpv7g؂B-'Dâlشg&JdpP Wpg`oV{p| NmIi8²BSI[}-\Sg A ک*d88J+TV%3No e1 UIP}],N}hgaWg1l5I re+Ο0D+\UٝfC˟DuuCgH&:l KW?ۇ^e(gGK֨V_p͚[thzl/1hE=&д0mAjQp!W0aO$ d5a=eeXEFJx^/wqˆsڜmov2+ώ$䅋hL[XwU .vPb#:HnѾ[~EHօp͑)A[ڭ?}@햋A>#E𔦉i[Wn]H 2%FBi^E]rZ1$TrڑU'2zwm=w0!- 1)N?=:D\̞1yLe.A@ͧgy$qYIԇc2RsU{y29(Cмb i j IP_,Ɋ}=A+#l D" z(..CBl+SDdGU$ wOt(ӸreD L҈GL&'n#;zLaS*^8PT%+,LHpaRL!ӹ #GM`/qetjLJύ@"}}]_NBn'abVe@֙k^J4WSܕᝥk Jvk.k޸ yٟ_UJC~9xOx:gxToKw%뿆ke,>ɇOb[e2Mɔ)Yz<%dt9:6\\6pez҇3TTJVKd9(NnӅB/pBl,Hn)jJjƋ b_";=4Λ;4OM9Ļus7w<6Pv($\{7ykzRIcg峑`Q;͖tf'M5ǒvEN<#Ӻ#SL ҝ)PuY8jt=2]{V xQGZvp֗c.wY7E.#Um߾ym6XOAo ݌BˊGO'MD^155".6J΀VVcJq|RUfE_4W+ۄh%-:]2E)SXgóJ,,p t٬-c:0J$RՕw3+Kh YwVr_ o1oc~̴:;r"^77.ŸC1JbbMW s%pE Z LH"_~-QHI=4UCpÂr*@kxdl'F>||'M^Q0rc LSZ[9,?gp{K*AM@9,dkftЮQ&4!RL6=|*K|fv֭/RT;r۔(n'j݆Аo\E;҂k8=-R{RUgbE7BT$ 1Z64K>B;zԈ;319qeCY)&*q%]*!QKf['2G/DtA1yLuΰ]^_Xnggrv>ɴ6b{^V|/Apsk98Js+'(BPX;STZ '[ex@$ 4[okOOIk5ma 7i%^/mHN~Z:gy[ځ'"to|IFI6<}Gڎ'ޟ} Zk5oV7C,o>@nS]˲shqnm>0`GN$'t1F!"/PXY"Ƞ4K`G;G~#!E) 4i@V5K4:c >0LJUd*@pcIcwD5TD`\L>-edZ`2vDlu8)|g1Yp&O ?\u´_=J#xLɲMۥ14qh4tܼ{f OXd I1Ds\E&O$/`R.| ,!Lio\$c6c)I3PY!X4qPY?Ј\O=:< *})EjaACQ 7*XuCVB\eH{Om=D&80-cӧ̶oԝ;H=,ňHLLFꮞ^#uLp*RwE H,J GꎺXrY $_Î36H޳59+fĩə GPv#Zzhn0H].la *[""/!uSF؉5tLŞwpxf+1uMr:ɨ$ӭ N$YY!) 54AU42@I3=#V350`TN&[5DɼFD Nʒ ^k6k$K%W[1";4fm7m/z\ZFtgb2fzfЛS7+2ITJRud@oID9Vٯasa" =>ko;M1FώhЛ}D&y160Nآ,%rq*,ѳ;_ޤ zCF̡+T_<ڜ*yP!h@ h۔Z 2MKWT]KEF"h#1 5GX;Jꡪ.ISow,,ta1Dl(ղUQd;=YYֲm:%LAi'Nn` 0A>⿆Jî-b5ʶVSA WH#Ģ>FW#g?\[7e1CjFŭO8KkܣhHG_?' 
g;[PcflX$;T`%pHn'.+9m'%8YWns9q+das.@XB4Ls,q!@'ty{gtK$6KKv[IUVii'j+[;U k`;Gҗʊ0&QA@' rUp +u{A5j_d }~U^[Sj~d_V; k{;GgZ;`+lJ/r(4*^Z +,s:C i<+m_bFTV/˥-5_ " ་R}2'!-PM>8SA=Wئ%\KM[ёޛ0Rc ݏ16T+_OS*м4:cu Rz%2-˪|e$ pKEp;_!Ҵmۗ۟jobj:Ԯ`O/N\4L۟xX0Mίˬ?|~?|~PS-#݅泿_~`h6)hK[ޯ㗫+2뻾SA@V>}?zvAR2,gWOko׌x4+f4 n֬kxP]FG6}!}5{w j$Zhys`w2DXw7UYUri%-IuɗGT؅ryP^m Il #;O}Os@ȨEȽ#"wngq66Pz .w5զ-&wwYܼ}󦷷$y"6 .L6\А iXcc 84'Jd#c ca.teB OO+mS I+ *< R-/h'X==w1H,Jõ_"cS # Ć넌9uf|}6le_GZoÌ1}1c0}= RMf 裿iNv0ƱcwaTkjbB WMcvtTKc6=񚜞v}5Uٺo:ݺK!__nawwʯ]^B"ul 3v}P:OThDIڈ/+] sl7ɨ zM','(A`Da!P(5} x3:E'S*՗j@iB2В;pN\Xd -q ѸL&D>FگYpyb%"*A{7crXc\D=A+8rvW:R렅UV`ʤqjdfQ!n3ҸU4ҤqjXϘ@#i+PbZ ՊB㩅PrdpzT5R>DDCkOTRZ ^v{PI,4dLd\94ѤO )sG@wg9jeߝkoi\N apl% a$*58?heB~a<٧Ի&KrOzϮțv>caOLT/XJ `ˌr8 eGu"u\ Y)SX]Q B!FJ2μOx8s,L6°SSUDNnPK_bj0NZzZ*L Sk-6j\ICE *))RJ&M%cdk.ٻ8ndW~ vqzK/ ؃`y ٤]Yq:A!{ZROkFb7{de'fnօ"+8:0TT8-Dz {א w$'0T@O jgk4>2`DmSş}iR~gws~U  5BZ8o LfAm@ ZO4R8#ǃUs v”lFRZGnj8_>^1 D;ϫp㣮&V%Oc>Qg?G*rn.|JwP"Y.n^֜Ra^@fBkVN뤒,n^!7YCDk]wck@B<8=C( z>ԇRkmn@}=L&škDx1[z>)@"_,cK8܃rF8'˥pgbv vt•g)\iL5HH45y עeH (@װL b"5"z)fKRե y5Z,U>a!)-1b.lӚȔB~l8:Nvj`r9Sb"y2NviLhB^!} KYf"VLVnPX`8KMFxQ_[JSL0o j]'@1arx\4$!"1&,k'bU( 1!\}FFfbYx5ztRI  Yi7-nj:d/S6h^SXXSj,f_2#Z LuF5%yMR,Ș+ŕ33@E|SP elTv$g@d>p@R3AvfEPl.#%7=Nbn LDIrJm&Ip;8O|Ħ[5Oluz~R s҄bR-14&Fab6idAtz΃.wtS A(_%$>T銼 ح,1 NY<CXʈZTECb\ck8GY4PwW[ jf#ij½E\JD [ZdN0Q[!e7 Q%އ)D~L΀\L^=)UgEY;{qlVLT0E#'u:^[>yh\A@ѷ,SؚFDAKʧZ/^( hWO%3^'B5v* {F6dH6>N~|МVmNiCdw9[ytj*%}//wQ7\}&z"nRY}n.Vw,(-EܦU*Tɜt`ػ .ؐw;|ݛ;)R˲D Bt+V5Vz[F `8{v_.6۔(7K' ) FH)tg|D\R/sSiYsѴ#ܡDsw,^WNLh%y `gTHY"){&9ܒŶ;jq)mGN.3au1`iL3HxʀqXn0Nh9's8-\# d Wv3M2*tZ*V闾}#qRW5Jk|F&D@N#յ"4pB"0\ -& TP[[E[iUEFP%9O[i*4> g b2NVx[Ἒ4.yfF~jnu6Ph bP+ *N_\ߠ>5  fnUcpj4=a%{\(iBj9SOѼu-2Ҹ5 Ze,emT2T@ *OAhqu7|BM0i-MA !,UwqFQI65W%ӡafl@oNJӸ34stMD ԟ`$B?9-mn鯔)qFVlhۋ+o7>Dm?߭i=O}ZO}9j=U`HQ@IkM P5&_D,_G#3DW/:[,RVi}ׂo>>Py] -drjkgF?T+@Z&0&0|Pyɔvk jD5{z( ڠNmnlK-%<ԨI>1,g5@e`H@/B at@S%Jt9KvLl^vԦ=#6P \drT~J)r,(o؎hSK7h5͗-_'P Il;uS(a3R?j CUZ<ݦdR1$H_!TRtYpԩh}D ϔ&}ſnKN~3v1@}Ћt4Lpk‹dev?mL:+#f&U|^XOѐhcK 
Mar 21 04:47:25 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 21 04:47:25 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:25 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 
04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 
crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 
04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:47:26 crc restorecon[4688]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 
crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:47:26 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 21 04:47:27 crc kubenswrapper[4775]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.424970 4775 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431082 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431105 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431127 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431133 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431137 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431142 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431146 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431151 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431155 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431159 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431163 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431167 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431171 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431174 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431179 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431195 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431201 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431206 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431212 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431217 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431223 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431229 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431235 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431240 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431244 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431248 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431252 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431257 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431260 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431264 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431269 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431274 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431278 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431284 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431289 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431293 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431299 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431305 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431310 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431314 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431318 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431323 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431328 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431332 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431337 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431341 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431345 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431350 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431355 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431359 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431363 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431367 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431371 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431378 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431382 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431386 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431389 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431393 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431397 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431401 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431406 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431410 4775 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431415 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431421 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431426 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431431 4775 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431436 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431441 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431445 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431450 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.431456 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432104 4775 flags.go:64] FLAG: --address="0.0.0.0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432135 4775 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432145 4775 flags.go:64] FLAG: --anonymous-auth="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432152 4775 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432159 4775 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432164 4775 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432172 4775 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432178 4775 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432183 4775 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432187 4775 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432192 4775 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432197 4775 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432202 4775 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432206 4775 flags.go:64] FLAG: --cgroup-root=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432211 4775 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432216 4775 flags.go:64] FLAG: --client-ca-file=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432221 4775 flags.go:64] FLAG: --cloud-config=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432225 4775 flags.go:64] FLAG: --cloud-provider=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432229 4775 flags.go:64] FLAG: --cluster-dns="[]"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432236 4775 flags.go:64] FLAG: --cluster-domain=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432241 4775 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432246 4775 flags.go:64] FLAG: --config-dir=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432250 4775 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432256 4775 flags.go:64] FLAG: --container-log-max-files="5"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432263 4775 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432268 4775 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432274 4775 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432279 4775 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432284 4775 flags.go:64] FLAG: --contention-profiling="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432289 4775 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432294 4775 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432300 4775 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432305 4775 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432312 4775 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432318 4775 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432324 4775 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432330 4775 flags.go:64] FLAG: --enable-load-reader="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432337 4775 flags.go:64] FLAG: --enable-server="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432342 4775 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432350 4775 flags.go:64] FLAG: --event-burst="100"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432356 4775 flags.go:64] FLAG: --event-qps="50"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432363 4775 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432368 4775 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432374 4775 flags.go:64] FLAG: --eviction-hard=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432381 4775 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432386 4775 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432392 4775 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432398 4775 flags.go:64] FLAG: --eviction-soft=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432403 4775 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432408 4775 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432413 4775 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432418 4775 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432423 4775 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432428 4775 flags.go:64] FLAG: --fail-swap-on="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432433 4775 flags.go:64] FLAG: --feature-gates=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432440 4775 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432445 4775 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432450 4775 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432455 4775 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432461 4775 flags.go:64] FLAG: --healthz-port="10248"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432469 4775 flags.go:64] FLAG: --help="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432474 4775 flags.go:64] FLAG: --hostname-override=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432480 4775 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432485 4775 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432490 4775 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432495 4775 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432500 4775 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432505 4775 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432510 4775 flags.go:64] FLAG: --image-service-endpoint=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432515 4775 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432520 4775 flags.go:64] FLAG: --kube-api-burst="100"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432525 4775 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432530 4775 flags.go:64] FLAG: --kube-api-qps="50"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432535 4775 flags.go:64] FLAG: --kube-reserved=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432540 4775 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432545 4775 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432550 4775 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432555 4775 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432560 4775 flags.go:64] FLAG: --lock-file=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432565 4775 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432570 4775 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432575 4775 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432583 4775 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432588 4775 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432592 4775 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432597 4775 flags.go:64] FLAG: --logging-format="text"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432602 4775 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432608 4775 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432613 4775 flags.go:64] FLAG: --manifest-url=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432618 4775 flags.go:64] FLAG: --manifest-url-header=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432625 4775 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432630 4775 flags.go:64] FLAG: --max-open-files="1000000"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432637 4775 flags.go:64] FLAG: --max-pods="110"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432643 4775 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432648 4775 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432654 4775 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432658 4775 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432664 4775 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432668 4775 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432673 4775 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432686 4775 flags.go:64] FLAG: --node-status-max-images="50"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432691 4775 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432696 4775 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432701 4775 flags.go:64] FLAG: --pod-cidr=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432707 4775 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432716 4775 flags.go:64] FLAG: --pod-manifest-path=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432731 4775 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432736 4775 flags.go:64] FLAG: --pods-per-core="0"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432741 4775 flags.go:64] FLAG: --port="10250"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432746 4775 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432751 4775 flags.go:64] FLAG: --provider-id=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432755 4775 flags.go:64] FLAG: --qos-reserved=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432760 4775 flags.go:64] FLAG: --read-only-port="10255"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432765 4775 flags.go:64] FLAG: --register-node="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432771 4775 flags.go:64] FLAG: --register-schedulable="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432777 4775 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432787 4775 flags.go:64] FLAG: --registry-burst="10"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432793 4775 flags.go:64] FLAG: --registry-qps="5"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432799 4775 flags.go:64] FLAG: --reserved-cpus=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432804 4775 flags.go:64] FLAG: --reserved-memory=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432811 4775 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432816 4775 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432822 4775 flags.go:64] FLAG: --rotate-certificates="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432827 4775 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432832 4775 flags.go:64] FLAG: --runonce="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432837 4775 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432842 4775 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432847 4775 flags.go:64] FLAG: --seccomp-default="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432852 4775 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432857 4775 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432862 4775 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432868 4775 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432873 4775 flags.go:64] FLAG: --storage-driver-password="root"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432878 4775 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432883 4775 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432889 4775 flags.go:64] FLAG: --storage-driver-user="root"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432894 4775 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432899 4775 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432909 4775 flags.go:64] FLAG: --system-cgroups=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432914 4775 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432923 4775 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432928 4775 flags.go:64] FLAG: --tls-cert-file=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432933 4775 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432939 4775 flags.go:64] FLAG: --tls-min-version=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432944 4775 flags.go:64] FLAG: --tls-private-key-file=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432949 4775 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432954 4775 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432959 4775 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432964 4775 flags.go:64] FLAG: --v="2"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432971 4775 flags.go:64] FLAG: --version="false"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432979 4775 flags.go:64] FLAG: --vmodule=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432986 4775 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.432991 4775 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433108 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433139 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433145 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433150 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433156 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:47:27 crc
kubenswrapper[4775]: W0321 04:47:27.433161 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433166 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433171 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433175 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433180 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433184 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433189 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433193 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433197 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433201 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433206 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433210 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433222 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433228 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433234 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433239 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433244 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433249 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433255 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433259 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433264 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433269 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433274 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433279 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433284 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433291 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433297 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433302 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433307 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433314 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433322 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433327 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433332 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433338 4775 feature_gate.go:330] unrecognized feature gate: Example Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433343 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433348 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433352 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433356 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433361 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433365 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433369 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433373 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433393 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433398 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433406 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433411 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433415 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433420 4775 
feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433424 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433428 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433433 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433437 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433442 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433446 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433452 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433456 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433461 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433465 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433470 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433474 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433478 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433482 4775 feature_gate.go:330] unrecognized feature gate: NewOLM 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433486 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433491 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433495 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.433499 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.433507 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.445079 4775 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.445144 4775 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445224 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445234 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445239 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445245 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445249 4775 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445253 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445256 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445260 4775 feature_gate.go:330] unrecognized feature gate: Example Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445264 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445268 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445271 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445275 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445278 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445281 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445285 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445289 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445292 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445297 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445303 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445308 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445311 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445316 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445319 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445323 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445327 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445331 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445336 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445343 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445347 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445352 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445356 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445362 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445368 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445373 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445386 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445392 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445396 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445401 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445405 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445410 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445414 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445418 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445423 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445429 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445438 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445442 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:47:27 crc 
kubenswrapper[4775]: W0321 04:47:27.445447 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445451 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445456 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445460 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445465 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445471 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445476 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445483 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445488 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445493 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445498 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445502 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445505 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445508 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 
04:47:27.445512 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445516 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445519 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445523 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445526 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445530 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445533 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445537 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445540 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445543 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445555 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.445564 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445748 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445765 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445772 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445779 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445783 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445787 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445791 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445794 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445798 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445807 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445811 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445814 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445817 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445823 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 21 
04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445828 4775 feature_gate.go:330] unrecognized feature gate: Example Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445831 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445835 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445838 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445841 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445845 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445849 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445852 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445855 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445859 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445862 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445867 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445872 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445875 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445879 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445882 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445886 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445889 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445893 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445897 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445906 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445910 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445913 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445917 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445921 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445925 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445929 4775 
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445932 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445968 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445972 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445975 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445979 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445984 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445988 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445991 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445995 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.445998 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446003 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446007 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446011 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446014 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446018 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446021 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446025 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446028 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446032 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446035 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446038 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446042 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446045 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446049 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446052 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446055 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446059 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446062 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446065 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.446076 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.446082 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.448669 4775 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.452838 4775 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.457266 4775 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.457524 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.459409 4775 server.go:997] "Starting client certificate rotation"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.459445 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.460593 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.493205 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.496659 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.497090 4775 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.514187 4775 log.go:25] "Validated CRI v1 runtime API"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.546058 4775 log.go:25] "Validated CRI v1 image API"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.549275 4775 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.555706 4775 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-21-04-43-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.555735 4775 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.575221 4775 manager.go:217] Machine: {Timestamp:2026-03-21 04:47:27.569507451 +0000 UTC m=+0.545971095 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b8f8f0e2-c78b-43d3-976e-2a86ca08a185 BootID:82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e7:06:8a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e7:06:8a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f6:e6:43 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8a:d6:aa Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ad:52:11 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:24:cd:92 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:da:5b:8e:5c:8e:2d Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:5b:74:dd:99:a7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.575574 4775 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.575899 4775 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.578684 4775 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.578916 4775 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.578952 4775 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.579257 4775 topology_manager.go:138] "Creating topology manager with none policy"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.579275 4775 container_manager_linux.go:303] "Creating device plugin manager"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.579749 4775 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.579787 4775 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.580050 4775 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.580438 4775 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.584823 4775 kubelet.go:418] "Attempting to sync node with API server"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.584871 4775 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.584908 4775 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.584926 4775 kubelet.go:324] "Adding apiserver pod source"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.584949 4775 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.589300 4775 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.589735 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.589813 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.589765 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.589894 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.590470 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.593528 4775 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595167 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595188 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595195 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595202 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595213 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595219 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595226 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595236 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595245 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595255 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595272 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.595280 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.597253 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.597691 4775 server.go:1280] "Started kubelet"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.598915 4775 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.599012 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.599036 4775 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 21 04:47:27 crc systemd[1]: Started Kubernetes Kubelet.
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.599680 4775 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.600906 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.600958 4775 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.601204 4775 server.go:460] "Adding debug handlers to kubelet server"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.601598 4775 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.601651 4775 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.601531 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.601869 4775 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.602978 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="200ms"
Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.603187 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.603248 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.612970 4775 factory.go:55] Registering systemd factory
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613024 4775 factory.go:221] Registration of the systemd container factory successfully
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613622 4775 factory.go:153] Registering CRI-O factory
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613651 4775 factory.go:221] Registration of the crio container factory successfully
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613798 4775 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613851 4775 factory.go:103] Registering Raw factory
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.613888 4775 manager.go:1196] Started watching for new ooms in manager
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.617009 4775 manager.go:319] Starting recovery of all containers
Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.616971 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ec1da8a498a5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,LastTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619316 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619395 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619412 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619426 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619440 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619483 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619497 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619513 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619531 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619546 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619561 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619574 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619588 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619604 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619616 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619630 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619643 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619658 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619672 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619686 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619704 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619721 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619737 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619773 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619786 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619799 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619815 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619831 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619845 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619860 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619873 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619917 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619932 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619945 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619958 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619975 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.619989 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620021 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620035 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620051 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620065 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620082 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620135 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620151 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620165 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620181 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620195 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620209 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620223 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620237 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620252 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620266 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620288 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9"
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620303 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620319 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620335 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620350 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620365 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620379 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" 
seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620393 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620410 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620424 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620438 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620452 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620466 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 
04:47:27.620479 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620495 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620508 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620522 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620535 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620550 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620570 4775 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620584 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620599 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620614 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620628 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620641 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620658 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620672 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620686 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620700 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620715 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620729 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620744 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620760 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620776 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.620792 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.622902 4775 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.622958 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.622982 4775 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623005 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623027 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623050 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623072 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623096 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623188 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623213 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623233 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623253 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623273 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623292 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623329 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623355 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623376 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623397 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623479 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623503 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623524 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" 
seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623547 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623568 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623588 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623608 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623632 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623654 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623675 4775 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623696 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623716 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623734 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623758 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623779 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623799 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623822 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623841 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623861 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623881 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623903 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623922 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623944 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623964 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.623983 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624003 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624024 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624044 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624067 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624088 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624108 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624152 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624172 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624191 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624212 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624231 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624250 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624268 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624287 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624307 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" 
seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624326 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624346 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624366 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624385 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624402 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624421 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624442 4775 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624461 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624479 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624499 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624517 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624537 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624557 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624577 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624596 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624617 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624636 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624654 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624673 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624692 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624712 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624733 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624753 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624771 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624795 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624814 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624833 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624854 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624873 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624893 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624915 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624935 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624953 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624973 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.624992 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625011 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625030 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625052 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625072 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625143 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625164 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625187 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625206 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625225 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625244 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625262 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625281 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625301 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625321 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625341 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625359 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625378 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625401 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625422 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625441 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625459 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625481 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625500 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625518 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625538 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625556 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625576 4775 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625597 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625615 4775 reconstruct.go:97] "Volume reconstruction finished" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.625628 4775 reconciler.go:26] "Reconciler: start to sync state" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.639645 4775 manager.go:324] Recovery completed Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.650253 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.653425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.653476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.653491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.654136 4775 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.654157 4775 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.654232 4775 state_mem.go:36] "Initialized new in-memory state store" Mar 21 
04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.658543 4775 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.660012 4775 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.660063 4775 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.660097 4775 kubelet.go:2335] "Starting kubelet main sync loop" Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.660168 4775 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 04:47:27 crc kubenswrapper[4775]: W0321 04:47:27.661650 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.661716 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.676003 4775 policy_none.go:49] "None policy: Start" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.677187 4775 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.677233 4775 state_mem.go:35] "Initializing new in-memory state store" Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.702793 4775 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.726006 4775 manager.go:334] "Starting Device Plugin manager" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.726082 4775 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.726095 4775 server.go:79] "Starting device plugin registration server" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.726577 4775 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.726593 4775 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.728020 4775 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.728168 4775 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.728179 4775 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.732593 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.760953 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.761156 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 
04:47:27.762860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.762912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.762926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.763093 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.763951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.763980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.763991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.765396 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.765440 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.765473 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.765565 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.765616 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766486 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766707 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.766765 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767323 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767687 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767783 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.767831 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768630 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768662 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768900 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.768947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.770747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.770770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.770782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.803766 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="400ms" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.827435 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.828734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:27 crc 
kubenswrapper[4775]: I0321 04:47:27.828789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.828800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.828941 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: E0321 04:47:27.829736 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829745 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829828 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829859 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829881 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829926 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.829976 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.830048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.830155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931689 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931756 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931785 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931907 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932111 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931866 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932182 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932304 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931940 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.932235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931904 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:27 crc kubenswrapper[4775]: I0321 04:47:27.931938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.030005 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.031876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.031995 4775 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.032102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.032229 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.033003 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.097744 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.106772 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.123428 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.147518 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.153070 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.163080 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-c5c1d5c259f2beab5080951416bfd5289ddef57bb979d944747ea4b3e7081287 WatchSource:0}: Error finding container c5c1d5c259f2beab5080951416bfd5289ddef57bb979d944747ea4b3e7081287: Status 404 returned error can't find the container with id c5c1d5c259f2beab5080951416bfd5289ddef57bb979d944747ea4b3e7081287 Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.164228 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0650b8396f80992e97f0325700c41c51f8284160620600d88f75773714af8b27 WatchSource:0}: Error finding container 0650b8396f80992e97f0325700c41c51f8284160620600d88f75773714af8b27: Status 404 returned error can't find the container with id 0650b8396f80992e97f0325700c41c51f8284160620600d88f75773714af8b27 Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.170771 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1ca29dbcf983f375a8a59480fdd65ddd29e56908139fcdd17346f8cc6cf01486 WatchSource:0}: Error finding container 1ca29dbcf983f375a8a59480fdd65ddd29e56908139fcdd17346f8cc6cf01486: Status 404 returned error can't find the container with id 1ca29dbcf983f375a8a59480fdd65ddd29e56908139fcdd17346f8cc6cf01486 Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.177936 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cd4a716363e7c11843b2e9634ff5e9e4bf620eb979fbe6e3cfb9e894c3c338a9 WatchSource:0}: Error finding container cd4a716363e7c11843b2e9634ff5e9e4bf620eb979fbe6e3cfb9e894c3c338a9: Status 404 returned error can't find the container with id cd4a716363e7c11843b2e9634ff5e9e4bf620eb979fbe6e3cfb9e894c3c338a9 Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.191162 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-47a8bc8f4176dcd8f7bcc28e3dfececb6304138c76584ddd9b3996ffbac6af07 WatchSource:0}: Error finding container 47a8bc8f4176dcd8f7bcc28e3dfececb6304138c76584ddd9b3996ffbac6af07: Status 404 returned error can't find the container with id 47a8bc8f4176dcd8f7bcc28e3dfececb6304138c76584ddd9b3996ffbac6af07 Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.204861 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="800ms" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.433781 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.435443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.435478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.435489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:28 crc kubenswrapper[4775]: 
I0321 04:47:28.435514 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.435939 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.473192 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.473279 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.600646 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.642890 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.642981 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.664890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0650b8396f80992e97f0325700c41c51f8284160620600d88f75773714af8b27"} Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.665790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"47a8bc8f4176dcd8f7bcc28e3dfececb6304138c76584ddd9b3996ffbac6af07"} Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.666726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd4a716363e7c11843b2e9634ff5e9e4bf620eb979fbe6e3cfb9e894c3c338a9"} Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.667762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1ca29dbcf983f375a8a59480fdd65ddd29e56908139fcdd17346f8cc6cf01486"} Mar 21 04:47:28 crc kubenswrapper[4775]: I0321 04:47:28.668582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c5c1d5c259f2beab5080951416bfd5289ddef57bb979d944747ea4b3e7081287"} Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.674421 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.674501 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:47:28 crc kubenswrapper[4775]: W0321 04:47:28.897462 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 21 04:47:28 crc kubenswrapper[4775]: E0321 04:47:28.897841 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:47:29 crc kubenswrapper[4775]: E0321 04:47:29.006226 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="1.6s" Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.236405 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.237739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 
04:47:29.237774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.237783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.237804 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:47:29 crc kubenswrapper[4775]: E0321 04:47:29.238300 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.600537 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.601599 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:47:29 crc kubenswrapper[4775]: E0321 04:47:29.602514 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.672473 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec" exitCode=0
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.672571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.672692 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.673664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.673691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.673700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.674591 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057" exitCode=0
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.674629 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.674797 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.675574 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.675843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.675861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.675869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.676390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.676407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.676416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.677233 4775 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948" exitCode=0
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.677301 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.677355 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.678927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.678962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.678977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.680338 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d" exitCode=0
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.680503 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.680614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.681854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.681887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.681900 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.686207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.686241 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.686261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.686273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b"}
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.686345 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.687070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.687098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:29 crc kubenswrapper[4775]: I0321 04:47:29.687109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: W0321 04:47:30.301853 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.301963 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:30 crc kubenswrapper[4775]: W0321 04:47:30.305383 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.305424 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:30 crc kubenswrapper[4775]: W0321 04:47:30.439566 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.439652 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.599737 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.607706 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="3.2s"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.689283 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539" exitCode=0
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.689340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.689364 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.690962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.691002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.691013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.693536 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.693561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.693574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.693606 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.694712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.694749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.694759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.695538 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.695562 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.696217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.696246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.696256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.699855 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.699889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.699893 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.699904 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.699914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70"}
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.700780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.700803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.700811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.838998 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.844239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.844305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.844316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:30 crc kubenswrapper[4775]: I0321 04:47:30.844352 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.845039 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc"
Mar 21 04:47:30 crc kubenswrapper[4775]: W0321 04:47:30.848932 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 21 04:47:30 crc kubenswrapper[4775]: E0321 04:47:30.849050 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:47:31 crc kubenswrapper[4775]: E0321 04:47:31.370094 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ec1da8a498a5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,LastTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.710036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c38b09928e8a15368f55622732324802c7af7bc3d13c7cf2d1d00b4ac43fef94"}
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.710175 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.712852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.712901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.712923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.716542 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645" exitCode=0
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.716658 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.717388 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.717955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645"}
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.718055 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.718087 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.719271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.719309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.719326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:31 crc kubenswrapper[4775]: I0321 04:47:31.720906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.394111 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.394288 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.395315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.395363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.395378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722107 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722177 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722564 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939"}
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987"}
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846"}
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4"}
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:32 crc kubenswrapper[4775]: I0321 04:47:32.722939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.132808 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.276444 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.531203 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.730621 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731071 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56"}
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.731806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:33 crc kubenswrapper[4775]: I0321 04:47:33.744002 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.046211 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.047568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.047645 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.047659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.047774 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.595711 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.595896 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.597500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.597753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.597944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.732672 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.732698 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.733833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.733860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.733869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.733864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.734030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:34 crc kubenswrapper[4775]: I0321 04:47:34.734056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.016885 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.017414 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.018374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.018407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.018417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.231861 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.394633 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.394718 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.615299 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.621555 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.734868 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.734881 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:35 crc kubenswrapper[4775]: I0321 04:47:35.735999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:36 crc kubenswrapper[4775]: I0321 04:47:36.737466 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:36 crc kubenswrapper[4775]: I0321 04:47:36.738406 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:36 crc kubenswrapper[4775]: I0321 04:47:36.738468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:36 crc kubenswrapper[4775]: I0321 04:47:36.738485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:36 crc kubenswrapper[4775]: I0321 04:47:36.952706 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:47:37 crc kubenswrapper[4775]: E0321 04:47:37.732804 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:47:37 crc kubenswrapper[4775]: I0321 04:47:37.738972 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:37 crc kubenswrapper[4775]: I0321 04:47:37.739733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:37 crc kubenswrapper[4775]: I0321 04:47:37.739763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:37 crc kubenswrapper[4775]: I0321 04:47:37.739775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:40 crc kubenswrapper[4775]: I0321 04:47:40.646862 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 21 04:47:40 crc kubenswrapper[4775]: I0321 04:47:40.648540 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:47:40 crc kubenswrapper[4775]: I0321 04:47:40.650335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:47:40 crc kubenswrapper[4775]: I0321 04:47:40.650439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:47:40 crc kubenswrapper[4775]: I0321 04:47:40.650455 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:47:41 crc kubenswrapper[4775]: I0321 04:47:41.601178 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.897460 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:47:41 crc kubenswrapper[4775]: W0321 04:47:41.899954 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z
Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.900015 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 21 04:47:41 crc kubenswrapper[4775]: W0321 04:47:41.900317 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z
Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.900451 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z"
logger="UnhandledError" Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.901592 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.904148 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:47:41 crc kubenswrapper[4775]: W0321 04:47:41.905665 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.905747 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:47:41 crc kubenswrapper[4775]: I0321 04:47:41.907206 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:47:41 crc kubenswrapper[4775]: I0321 04:47:41.907276 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.908397 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec1da8a498a5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,LastTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:41 crc kubenswrapper[4775]: W0321 04:47:41.909674 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z Mar 21 04:47:41 crc kubenswrapper[4775]: E0321 04:47:41.909747 4775 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:47:41 crc kubenswrapper[4775]: I0321 04:47:41.913815 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:47:41 crc kubenswrapper[4775]: I0321 04:47:41.913883 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.607019 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:42Z is after 2026-02-23T05:33:13Z Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.755109 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.757734 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c38b09928e8a15368f55622732324802c7af7bc3d13c7cf2d1d00b4ac43fef94" exitCode=255 Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.757790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c38b09928e8a15368f55622732324802c7af7bc3d13c7cf2d1d00b4ac43fef94"} Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.757959 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.758944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.759049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.759139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:42 crc kubenswrapper[4775]: I0321 04:47:42.759735 4775 scope.go:117] "RemoveContainer" containerID="c38b09928e8a15368f55622732324802c7af7bc3d13c7cf2d1d00b4ac43fef94" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.535717 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.602681 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:43Z is after 2026-02-23T05:33:13Z Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.763683 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.766339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc"} Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.766560 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.768084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.768143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.768155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:43 crc kubenswrapper[4775]: I0321 04:47:43.771347 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.602748 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:44Z is after 2026-02-23T05:33:13Z Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.770481 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:47:44 crc kubenswrapper[4775]: 
I0321 04:47:44.771214 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.772761 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc" exitCode=255 Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.772901 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc"} Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.773013 4775 scope.go:117] "RemoveContainer" containerID="c38b09928e8a15368f55622732324802c7af7bc3d13c7cf2d1d00b4ac43fef94" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.773245 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.774259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.774368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.774438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:44 crc kubenswrapper[4775]: I0321 04:47:44.774965 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc" Mar 21 04:47:44 crc kubenswrapper[4775]: E0321 04:47:44.775224 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.022039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.022198 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.023131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.023160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.023170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.395720 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.395843 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.604565 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:45Z is after 2026-02-23T05:33:13Z Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.778386 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.780553 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.781436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.781478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.781492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:45 crc kubenswrapper[4775]: I0321 04:47:45.782135 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc" Mar 21 04:47:45 crc kubenswrapper[4775]: E0321 04:47:45.782378 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
Mar 21 04:47:46 crc kubenswrapper[4775]: I0321 04:47:46.602980 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:46Z is after 2026-02-23T05:33:13Z Mar 21 04:47:47 crc kubenswrapper[4775]: I0321 04:47:47.603665 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:47:47Z is after 2026-02-23T05:33:13Z Mar 21 04:47:47 crc kubenswrapper[4775]: E0321 04:47:47.732947 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.304909 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.306047 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.306107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.306143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.306171 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:47:48 crc kubenswrapper[4775]: E0321 04:47:48.306594 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User 
\"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:47:48 crc kubenswrapper[4775]: E0321 04:47:48.309583 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:47:48 crc kubenswrapper[4775]: I0321 04:47:48.604818 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:48 crc kubenswrapper[4775]: W0321 04:47:48.838603 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:48 crc kubenswrapper[4775]: E0321 04:47:48.838685 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 04:47:48 crc kubenswrapper[4775]: W0321 04:47:48.916321 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 21 04:47:48 crc kubenswrapper[4775]: E0321 04:47:48.916386 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.605353 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.703335 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.703524 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.704621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.704656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.704667 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:49 crc kubenswrapper[4775]: I0321 04:47:49.705223 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc" Mar 21 04:47:49 crc kubenswrapper[4775]: E0321 04:47:49.705411 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.231415 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: 
Rotating certificates Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.244752 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:47:50 crc kubenswrapper[4775]: W0321 04:47:50.330804 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 21 04:47:50 crc kubenswrapper[4775]: E0321 04:47:50.330913 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.603180 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.685754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.686074 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.687749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.687802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.687819 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.703753 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.793372 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.794767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.794841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:50 crc kubenswrapper[4775]: I0321 04:47:50.794865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:51 crc kubenswrapper[4775]: I0321 04:47:51.603800 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.913746 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8a498a5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC m=+0.574124389,LastTimestamp:2026-03-21 04:47:27.597660765 +0000 UTC 
m=+0.574124389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.919201 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.923644 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 
04:47:51.927720 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.931823 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da9215a278 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.728476792 +0000 UTC m=+0.704940426,LastTimestamp:2026-03-21 04:47:27.728476792 +0000 UTC m=+0.704940426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.937371 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.762896765 +0000 UTC m=+0.739360399,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.941528 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.762922126 +0000 UTC m=+0.739385760,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.945727 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.762933036 +0000 UTC m=+0.739396670,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.950468 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.763968134 +0000 UTC m=+0.740431768,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.954416 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.763987255 +0000 UTC m=+0.740450889,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.960075 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.763997745 +0000 UTC m=+0.740461379,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.967466 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC 
m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.766332088 +0000 UTC m=+0.742795722,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.972549 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.766348809 +0000 UTC m=+0.742812443,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.977530 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.766361109 +0000 UTC m=+0.742824743,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.981955 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.767301695 +0000 UTC m=+0.743765319,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.985877 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.767318645 +0000 UTC m=+0.743782269,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.990999 4775 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.767331535 +0000 UTC m=+0.743795159,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:51 crc kubenswrapper[4775]: E0321 04:47:51.995851 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.76748279 +0000 UTC m=+0.743946424,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.000162 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.76749861 +0000 UTC m=+0.743962254,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.003824 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.76751009 +0000 UTC m=+0.743973724,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.008318 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.767684555 +0000 UTC m=+0.744148179,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.012217 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.767735226 +0000 UTC m=+0.744198850,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.015648 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d84c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d84c9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653496009 +0000 UTC m=+0.629959633,LastTimestamp:2026-03-21 04:47:27.767761947 +0000 UTC m=+0.744225571,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.020241 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d0b56\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d0b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653464918 +0000 UTC m=+0.629928542,LastTimestamp:2026-03-21 04:47:27.768414995 +0000 UTC m=+0.744878629,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.025396 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec1da8d9d6092\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec1da8d9d6092 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:27.653486738 +0000 UTC 
m=+0.629950362,LastTimestamp:2026-03-21 04:47:27.768428985 +0000 UTC m=+0.744892619,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.031185 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1daacc2bcfe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.176028926 +0000 UTC m=+1.152492550,LastTimestamp:2026-03-21 04:47:28.176028926 +0000 UTC m=+1.152492550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.035830 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1daacce7940 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.176798016 +0000 UTC m=+1.153261640,LastTimestamp:2026-03-21 04:47:28.176798016 +0000 UTC m=+1.153261640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.041214 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1daacd022c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.176906949 +0000 UTC m=+1.153370563,LastTimestamp:2026-03-21 04:47:28.176906949 +0000 UTC m=+1.153370563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.046513 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1daad5d6e4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.186166859 +0000 UTC m=+1.162630483,LastTimestamp:2026-03-21 04:47:28.186166859 +0000 UTC m=+1.162630483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.050496 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daae2e97d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.199874518 +0000 UTC m=+1.176338142,LastTimestamp:2026-03-21 
04:47:28.199874518 +0000 UTC m=+1.176338142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.054752 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1dacee57213 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.748728851 +0000 UTC m=+1.725192475,LastTimestamp:2026-03-21 04:47:28.748728851 +0000 UTC m=+1.725192475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.058463 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dacee6d6ff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.748820223 +0000 UTC m=+1.725283857,LastTimestamp:2026-03-21 04:47:28.748820223 +0000 UTC 
m=+1.725283857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.061995 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1dacee7af87 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.748875655 +0000 UTC m=+1.725339279,LastTimestamp:2026-03-21 04:47:28.748875655 +0000 UTC m=+1.725339279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.065286 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dacee9cd06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.749014278 +0000 UTC m=+1.725477902,LastTimestamp:2026-03-21 04:47:28.749014278 +0000 UTC m=+1.725477902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.069786 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1daceeac6ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.74907825 +0000 UTC m=+1.725541874,LastTimestamp:2026-03-21 04:47:28.74907825 +0000 UTC m=+1.725541874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.073928 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1dacf77de53 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.758324819 +0000 UTC m=+1.734788443,LastTimestamp:2026-03-21 04:47:28.758324819 +0000 UTC m=+1.734788443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.078320 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dacf967b9b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.760331163 +0000 UTC m=+1.736794787,LastTimestamp:2026-03-21 04:47:28.760331163 +0000 UTC m=+1.736794787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.081732 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1dacf970da0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.760368544 +0000 UTC m=+1.736832168,LastTimestamp:2026-03-21 04:47:28.760368544 +0000 UTC m=+1.736832168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.085430 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dacfa25430 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.761107504 +0000 UTC m=+1.737571118,LastTimestamp:2026-03-21 04:47:28.761107504 +0000 UTC m=+1.737571118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: 
E0321 04:47:52.089058 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1dacfa91ce0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.761552096 +0000 UTC m=+1.738015720,LastTimestamp:2026-03-21 04:47:28.761552096 +0000 UTC m=+1.738015720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.092907 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dacfa9571f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.761567007 +0000 UTC m=+1.738030631,LastTimestamp:2026-03-21 04:47:28.761567007 +0000 UTC m=+1.738030631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.097656 4775 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dae0e03574 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.05037554 +0000 UTC m=+2.026839174,LastTimestamp:2026-03-21 04:47:29.05037554 +0000 UTC m=+2.026839174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.101987 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dae17e77a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.06074717 +0000 UTC m=+2.037210794,LastTimestamp:2026-03-21 04:47:29.06074717 +0000 UTC m=+2.037210794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.106051 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dae18aadf7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.061547511 +0000 UTC m=+2.038011135,LastTimestamp:2026-03-21 04:47:29.061547511 +0000 UTC m=+2.038011135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.109590 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daeb936f02 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.229893378 +0000 UTC m=+2.206357002,LastTimestamp:2026-03-21 04:47:29.229893378 +0000 UTC m=+2.206357002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.113079 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daecc8d0ad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.250169005 +0000 UTC m=+2.226632629,LastTimestamp:2026-03-21 04:47:29.250169005 +0000 UTC m=+2.226632629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.116724 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daecd5c6ee openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.251018478 +0000 UTC m=+2.227482102,LastTimestamp:2026-03-21 04:47:29.251018478 +0000 UTC m=+2.227482102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.120040 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daf5c7fe7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.401110142 +0000 UTC m=+2.377573766,LastTimestamp:2026-03-21 04:47:29.401110142 +0000 UTC 
m=+2.377573766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.123343 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1daf6ee5492 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.420399762 +0000 UTC m=+2.396863406,LastTimestamp:2026-03-21 04:47:29.420399762 +0000 UTC m=+2.396863406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.126754 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db061f1364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.67525258 +0000 UTC m=+2.651716204,LastTimestamp:2026-03-21 04:47:29.67525258 +0000 UTC m=+2.651716204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.131068 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db063fcf08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.677397768 +0000 UTC m=+2.653861392,LastTimestamp:2026-03-21 04:47:29.677397768 +0000 UTC m=+2.653861392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.134538 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db067f1426 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.68154423 +0000 UTC m=+2.658007864,LastTimestamp:2026-03-21 04:47:29.68154423 +0000 UTC m=+2.658007864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.138813 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1db06a2ac81 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.683876993 +0000 UTC m=+2.660340617,LastTimestamp:2026-03-21 04:47:29.683876993 +0000 UTC m=+2.660340617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 
04:47:52.143360 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db1343f91e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.895774494 +0000 UTC m=+2.872238118,LastTimestamp:2026-03-21 04:47:29.895774494 +0000 UTC m=+2.872238118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.147283 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db13582ab5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.897097909 +0000 UTC m=+2.873561553,LastTimestamp:2026-03-21 04:47:29.897097909 +0000 UTC m=+2.873561553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 
04:47:52.152642 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db135ff505 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.897608453 +0000 UTC m=+2.874072077,LastTimestamp:2026-03-21 04:47:29.897608453 +0000 UTC m=+2.874072077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.156849 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1db13612f49 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.897688905 +0000 UTC m=+2.874152529,LastTimestamp:2026-03-21 04:47:29.897688905 +0000 UTC m=+2.874152529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.162143 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec1db14ad2120 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.919443232 +0000 UTC m=+2.895906856,LastTimestamp:2026-03-21 04:47:29.919443232 +0000 UTC m=+2.895906856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.166053 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db14b0aa12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.919674898 +0000 UTC m=+2.896138522,LastTimestamp:2026-03-21 
04:47:29.919674898 +0000 UTC m=+2.896138522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.169539 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db14bf85ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.920648634 +0000 UTC m=+2.897112258,LastTimestamp:2026-03-21 04:47:29.920648634 +0000 UTC m=+2.897112258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.172816 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db157d2ad9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.933077209 +0000 UTC m=+2.909540833,LastTimestamp:2026-03-21 04:47:29.933077209 +0000 UTC m=+2.909540833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.176072 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db1594a611 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.934616081 +0000 UTC m=+2.911079705,LastTimestamp:2026-03-21 04:47:29.934616081 +0000 UTC m=+2.911079705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.180325 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db161c65b4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.9435125 +0000 UTC m=+2.919976124,LastTimestamp:2026-03-21 04:47:29.9435125 +0000 UTC m=+2.919976124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.185546 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db2725c9bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.229340604 +0000 UTC m=+3.205804228,LastTimestamp:2026-03-21 04:47:30.229340604 +0000 UTC m=+3.205804228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.189076 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db27874301 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.235728641 +0000 UTC m=+3.212192265,LastTimestamp:2026-03-21 04:47:30.235728641 +0000 UTC m=+3.212192265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.192648 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db27da0019 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.241151001 +0000 UTC m=+3.217614625,LastTimestamp:2026-03-21 04:47:30.241151001 +0000 UTC m=+3.217614625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.196692 4775 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db27e4bf4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.24185531 +0000 UTC m=+3.218318934,LastTimestamp:2026-03-21 04:47:30.24185531 +0000 UTC m=+3.218318934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.201011 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db28641015 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.250199061 +0000 UTC m=+3.226662685,LastTimestamp:2026-03-21 04:47:30.250199061 +0000 UTC 
m=+3.226662685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.205229 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db28740eb0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.25124728 +0000 UTC m=+3.227710984,LastTimestamp:2026-03-21 04:47:30.25124728 +0000 UTC m=+3.227710984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.208686 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db31ce6f4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.408165199 +0000 UTC m=+3.384628823,LastTimestamp:2026-03-21 04:47:30.408165199 +0000 UTC m=+3.384628823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.213298 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db32028d01 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.411580673 +0000 UTC m=+3.388044307,LastTimestamp:2026-03-21 04:47:30.411580673 +0000 UTC m=+3.388044307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.216839 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db336753ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.43496241 +0000 UTC m=+3.411426034,LastTimestamp:2026-03-21 04:47:30.43496241 +0000 UTC m=+3.411426034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.221211 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db33783e77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.436071031 +0000 UTC m=+3.412534655,LastTimestamp:2026-03-21 04:47:30.436071031 +0000 UTC m=+3.412534655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.225008 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec1db33dfb810 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.442852368 +0000 UTC m=+3.419315992,LastTimestamp:2026-03-21 04:47:30.442852368 +0000 UTC m=+3.419315992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.229482 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db3d0ae7a5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.596677541 +0000 UTC 
m=+3.573141165,LastTimestamp:2026-03-21 04:47:30.596677541 +0000 UTC m=+3.573141165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.233536 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db3dd9c610 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.610234896 +0000 UTC m=+3.586698530,LastTimestamp:2026-03-21 04:47:30.610234896 +0000 UTC m=+3.586698530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.237801 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db3deb28d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.611374288 +0000 UTC m=+3.587837922,LastTimestamp:2026-03-21 04:47:30.611374288 +0000 UTC m=+3.587837922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.241884 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db42d02fd1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.693492689 +0000 UTC m=+3.669956313,LastTimestamp:2026-03-21 04:47:30.693492689 +0000 UTC m=+3.669956313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.246407 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db46e308fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.761836798 +0000 UTC m=+3.738300422,LastTimestamp:2026-03-21 04:47:30.761836798 +0000 UTC m=+3.738300422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.249872 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db47f0ca4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.779515467 +0000 UTC m=+3.755979091,LastTimestamp:2026-03-21 04:47:30.779515467 +0000 UTC m=+3.755979091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.253939 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ec1db4d4456c6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.868876998 +0000 UTC m=+3.845340622,LastTimestamp:2026-03-21 04:47:30.868876998 +0000 UTC m=+3.845340622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.257025 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db4e2f08c9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.884257993 +0000 UTC m=+3.860721617,LastTimestamp:2026-03-21 04:47:30.884257993 +0000 UTC m=+3.860721617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.261343 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ec1db802c518e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:31.722940814 +0000 UTC m=+4.699404478,LastTimestamp:2026-03-21 04:47:31.722940814 +0000 UTC m=+4.699404478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.264808 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db8d05e23b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:31.938525755 +0000 UTC m=+4.914989379,LastTimestamp:2026-03-21 04:47:31.938525755 +0000 UTC m=+4.914989379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.268244 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db8db7d854 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:31.950188628 +0000 UTC m=+4.926652262,LastTimestamp:2026-03-21 04:47:31.950188628 +0000 UTC m=+4.926652262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.271728 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db8dc59971 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:31.951090033 +0000 UTC m=+4.927553657,LastTimestamp:2026-03-21 04:47:31.951090033 +0000 UTC m=+4.927553657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.275078 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db9b63a4bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.17955142 +0000 UTC m=+5.156015074,LastTimestamp:2026-03-21 04:47:32.17955142 +0000 UTC m=+5.156015074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.278661 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1db9c4f3799 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.194989977 +0000 UTC m=+5.171453641,LastTimestamp:2026-03-21 04:47:32.194989977 +0000 UTC m=+5.171453641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.282135 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ec1db9c76cd0b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.197584139 +0000 UTC m=+5.174047763,LastTimestamp:2026-03-21 04:47:32.197584139 +0000 UTC m=+5.174047763,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.285663 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dba7d409b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.388243891 +0000 UTC m=+5.364707515,LastTimestamp:2026-03-21 04:47:32.388243891 +0000 UTC m=+5.364707515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.289336 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dba8dd36e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.405622501 +0000 UTC m=+5.382086135,LastTimestamp:2026-03-21 04:47:32.405622501 +0000 UTC m=+5.382086135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.292285 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dba8eebe21 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.406771233 +0000 UTC m=+5.383234857,LastTimestamp:2026-03-21 04:47:32.406771233 +0000 UTC m=+5.383234857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.295877 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dbb335a68b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.579190411 +0000 UTC m=+5.555654035,LastTimestamp:2026-03-21 04:47:32.579190411 +0000 UTC m=+5.555654035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.299535 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dbb3e4e968 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.590676328 +0000 UTC m=+5.567139942,LastTimestamp:2026-03-21 04:47:32.590676328 +0000 UTC m=+5.567139942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.303406 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dbb3f41170 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.591669616 +0000 UTC m=+5.568133240,LastTimestamp:2026-03-21 04:47:32.591669616 +0000 UTC m=+5.568133240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.307403 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dbbccfeddb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.740296155 +0000 UTC m=+5.716759799,LastTimestamp:2026-03-21 04:47:32.740296155 +0000 UTC m=+5.716759799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.311288 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec1dbbda7e3a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:32.754449317 +0000 UTC m=+5.730912941,LastTimestamp:2026-03-21 04:47:32.754449317 +0000 UTC m=+5.730912941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.315352 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:47:52 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b06d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:47:52 crc kubenswrapper[4775]: body: Mar 21 04:47:52 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC m=+8.371158876,LastTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC 
m=+8.371158876,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:47:52 crc kubenswrapper[4775]: > Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.318705 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b07b0b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394750644 +0000 UTC m=+8.371214268,LastTimestamp:2026-03-21 04:47:35.394750644 +0000 UTC m=+8.371214268,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.323617 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:47:52 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-apiserver-crc.189ec1dddf349906 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:47:52 crc kubenswrapper[4775]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:47:52 crc kubenswrapper[4775]: Mar 21 04:47:52 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:41.90725351 +0000 UTC m=+14.883717124,LastTimestamp:2026-03-21 04:47:41.90725351 +0000 UTC m=+14.883717124,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:47:52 crc kubenswrapper[4775]: > Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.329193 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1dddf360eb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:41.907349173 +0000 UTC m=+14.883812817,LastTimestamp:2026-03-21 04:47:41.907349173 +0000 UTC m=+14.883812817,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.333415 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec1dddf349906\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:47:52 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-apiserver-crc.189ec1dddf349906 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:47:52 crc kubenswrapper[4775]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:47:52 crc kubenswrapper[4775]: Mar 21 04:47:52 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:41.90725351 +0000 UTC m=+14.883717124,LastTimestamp:2026-03-21 04:47:41.913863893 +0000 UTC m=+14.890327517,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:47:52 crc kubenswrapper[4775]: > Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.336409 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec1dddf360eb5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1dddf360eb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:41.907349173 +0000 UTC m=+14.883812817,LastTimestamp:2026-03-21 04:47:41.913906864 +0000 UTC m=+14.890370488,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.339650 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec1db3deb28d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db3deb28d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.611374288 +0000 UTC m=+3.587837922,LastTimestamp:2026-03-21 04:47:42.760975897 +0000 UTC m=+15.737439521,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.344178 4775 event.go:359] "Server rejected event (will not 
retry!)" err="events \"kube-apiserver-crc.189ec1db46e308fe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db46e308fe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.761836798 +0000 UTC m=+3.738300422,LastTimestamp:2026-03-21 04:47:43.003957656 +0000 UTC m=+15.980421280,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.349706 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec1db47f0ca4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec1db47f0ca4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:30.779515467 +0000 UTC m=+3.755979091,LastTimestamp:2026-03-21 04:47:43.012821501 +0000 UTC m=+15.989285125,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.355530 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b06d854\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:47:52 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b06d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:47:52 crc kubenswrapper[4775]: body: Mar 21 04:47:52 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC m=+8.371158876,LastTimestamp:2026-03-21 04:47:45.395807312 +0000 UTC m=+18.372270936,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:47:52 crc kubenswrapper[4775]: > Mar 21 04:47:52 crc kubenswrapper[4775]: E0321 04:47:52.360306 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b07b0b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b07b0b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394750644 +0000 UTC m=+8.371214268,LastTimestamp:2026-03-21 04:47:45.395884935 +0000 UTC m=+18.372348559,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:52 crc kubenswrapper[4775]: I0321 04:47:52.604354 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.132924 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.133191 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.134492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.134556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.134569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 
04:47:53.135108 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc" Mar 21 04:47:53 crc kubenswrapper[4775]: E0321 04:47:53.135286 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:47:53 crc kubenswrapper[4775]: W0321 04:47:53.150994 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 21 04:47:53 crc kubenswrapper[4775]: E0321 04:47:53.151077 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 04:47:53 crc kubenswrapper[4775]: I0321 04:47:53.604900 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:54 crc kubenswrapper[4775]: I0321 04:47:54.603991 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.309748 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.310926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.310981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.311012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.311047 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.311443 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.315278 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.394830 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.394931 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.395090 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.395265 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.396258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.396296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.396305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.396733 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.396874 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4" gracePeriod=30 Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.399958 4775 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.189ec1dc5b06d854\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:47:55 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b06d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:47:55 crc kubenswrapper[4775]: body: Mar 21 04:47:55 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC m=+8.371158876,LastTimestamp:2026-03-21 04:47:55.394900916 +0000 UTC m=+28.371364540,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:47:55 crc kubenswrapper[4775]: > Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.405483 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b07b0b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b07b0b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394750644 +0000 UTC m=+8.371214268,LastTimestamp:2026-03-21 04:47:55.39503712 +0000 UTC m=+28.371500744,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.411406 4775 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1e1033fad92 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:55.39686133 +0000 UTC m=+28.373324954,LastTimestamp:2026-03-21 04:47:55.39686133 +0000 UTC m=+28.373324954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.514533 4775 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-controller-manager-crc.189ec1dacfa25430\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dacfa25430 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:28.761107504 +0000 UTC m=+1.737571118,LastTimestamp:2026-03-21 04:47:55.509932247 +0000 UTC m=+28.486395871,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.603339 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.649655 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dae0e03574\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dae0e03574 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.05037554 +0000 UTC m=+2.026839174,LastTimestamp:2026-03-21 04:47:55.64386567 +0000 UTC m=+28.620329294,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:55 crc kubenswrapper[4775]: E0321 04:47:55.657308 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dae17e77a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dae17e77a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:29.06074717 +0000 UTC m=+2.037210794,LastTimestamp:2026-03-21 04:47:55.65324367 +0000 UTC m=+28.629707294,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.806405 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.806978 4775 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4" exitCode=255 Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.807092 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4"} Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.807227 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd"} Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.807612 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.808880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.809042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:55 crc kubenswrapper[4775]: I0321 04:47:55.809165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.604302 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" 
in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.953707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.953862 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.955044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.955094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:47:56 crc kubenswrapper[4775]: I0321 04:47:56.955111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:47:57 crc kubenswrapper[4775]: I0321 04:47:57.603909 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:57 crc kubenswrapper[4775]: E0321 04:47:57.733148 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:47:58 crc kubenswrapper[4775]: I0321 04:47:58.604429 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:47:59 crc kubenswrapper[4775]: I0321 04:47:59.604515 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 21 04:48:00 crc kubenswrapper[4775]: I0321 04:48:00.604840 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:48:01 crc kubenswrapper[4775]: I0321 04:48:01.604936 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.315384 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:48:02 crc kubenswrapper[4775]: E0321 04:48:02.316444 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.317354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.317508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.317535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.317576 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:48:02 crc kubenswrapper[4775]: E0321 04:48:02.321710 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create 
resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.395049 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.395363 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.397034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.397099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.397145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:02 crc kubenswrapper[4775]: I0321 04:48:02.604446 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:03 crc kubenswrapper[4775]: I0321 04:48:03.603847 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:03 crc kubenswrapper[4775]: W0321 04:48:03.699067 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 21 04:48:03 crc kubenswrapper[4775]: E0321 04:48:03.699181 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 21 04:48:04 crc kubenswrapper[4775]: I0321 04:48:04.604973 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:05 crc kubenswrapper[4775]: W0321 04:48:05.394185 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:05 crc kubenswrapper[4775]: E0321 04:48:05.394437 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 21 04:48:05 crc kubenswrapper[4775]: I0321 04:48:05.395224 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:48:05 crc kubenswrapper[4775]: I0321 04:48:05.395279 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:48:05 crc kubenswrapper[4775]: E0321 04:48:05.399568 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b06d854\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 21 04:48:05 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b06d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 21 04:48:05 crc kubenswrapper[4775]: body:
Mar 21 04:48:05 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC m=+8.371158876,LastTimestamp:2026-03-21 04:48:05.395264485 +0000 UTC m=+38.371728109,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 21 04:48:05 crc kubenswrapper[4775]: >
Mar 21 04:48:05 crc kubenswrapper[4775]: E0321 04:48:05.404506 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b07b0b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b07b0b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394750644 +0000 UTC m=+8.371214268,LastTimestamp:2026-03-21 04:48:05.395302146 +0000 UTC m=+38.371765770,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 21 04:48:05 crc kubenswrapper[4775]: I0321 04:48:05.603817 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:06 crc kubenswrapper[4775]: I0321 04:48:06.612697 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:06 crc kubenswrapper[4775]: W0321 04:48:06.876411 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 21 04:48:06 crc kubenswrapper[4775]: E0321 04:48:06.876471 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 21 04:48:07 crc kubenswrapper[4775]: I0321 04:48:07.606217 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:07 crc kubenswrapper[4775]: E0321 04:48:07.733238 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.604887 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.661312 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.662691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.662744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.662755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.663396 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.843239 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.844976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"}
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.845146 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.845848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.845886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:08 crc kubenswrapper[4775]: I0321 04:48:08.845897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:09 crc kubenswrapper[4775]: E0321 04:48:09.320933 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.321816 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.322931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.322956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.322967 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.322999 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:48:09 crc kubenswrapper[4775]: E0321 04:48:09.327953 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.606142 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.849071 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.849532 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.851024 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b" exitCode=255
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.851067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"}
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.851105 4775 scope.go:117] "RemoveContainer" containerID="a3ec38c653682ae3ceeecec7bf83cf11591c8de9989f8ce7f4b2ce4297b763cc"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.851254 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.852323 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.852377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.852391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:09 crc kubenswrapper[4775]: I0321 04:48:09.853061 4775 scope.go:117] "RemoveContainer" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"
Mar 21 04:48:09 crc kubenswrapper[4775]: E0321 04:48:09.853273 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:10 crc kubenswrapper[4775]: I0321 04:48:10.605832 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:10 crc kubenswrapper[4775]: I0321 04:48:10.855238 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 21 04:48:11 crc kubenswrapper[4775]: I0321 04:48:11.605577 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:12 crc kubenswrapper[4775]: I0321 04:48:12.604992 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.133029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.133243 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.134257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.134290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.134301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:13 crc kubenswrapper[4775]: I0321 04:48:13.134858 4775 scope.go:117] "RemoveContainer" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"
Mar 21 04:48:13 crc kubenswrapper[4775]: E0321 04:48:13.135027 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.479557 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.601250 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.601456 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.603111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.603199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.603217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:14 crc kubenswrapper[4775]: I0321 04:48:14.603771 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:15 crc kubenswrapper[4775]: I0321 04:48:15.395861 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:48:15 crc kubenswrapper[4775]: I0321 04:48:15.395953 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:48:15 crc kubenswrapper[4775]: E0321 04:48:15.401418 4775 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec1dc5b06d854\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 21 04:48:15 crc kubenswrapper[4775]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec1dc5b06d854 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 21 04:48:15 crc kubenswrapper[4775]: body:
Mar 21 04:48:15 crc kubenswrapper[4775]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:47:35.394695252 +0000 UTC m=+8.371158876,LastTimestamp:2026-03-21 04:48:15.395933439 +0000 UTC m=+48.372397063,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 21 04:48:15 crc kubenswrapper[4775]: >
Mar 21 04:48:15 crc kubenswrapper[4775]: I0321 04:48:15.604758 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:16 crc kubenswrapper[4775]: E0321 04:48:16.326083 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.328043 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.329350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.329393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.329405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.329444 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:48:16 crc kubenswrapper[4775]: E0321 04:48:16.334435 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 04:48:16 crc kubenswrapper[4775]: I0321 04:48:16.604455 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:17 crc kubenswrapper[4775]: I0321 04:48:17.604343 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:17 crc kubenswrapper[4775]: E0321 04:48:17.734161 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:48:18 crc kubenswrapper[4775]: W0321 04:48:18.499460 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 21 04:48:18 crc kubenswrapper[4775]: E0321 04:48:18.499572 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 21 04:48:18 crc kubenswrapper[4775]: I0321 04:48:18.605554 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.606686 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.703372 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.703652 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.704723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.704777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.704788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:19 crc kubenswrapper[4775]: I0321 04:48:19.705390 4775 scope.go:117] "RemoveContainer" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"
Mar 21 04:48:19 crc kubenswrapper[4775]: E0321 04:48:19.705600 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:20 crc kubenswrapper[4775]: I0321 04:48:20.606309 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:21 crc kubenswrapper[4775]: I0321 04:48:21.604533 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:22 crc kubenswrapper[4775]: I0321 04:48:22.606210 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:23 crc kubenswrapper[4775]: E0321 04:48:23.331724 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.334808 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.336159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.336198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.336210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.336242 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:48:23 crc kubenswrapper[4775]: E0321 04:48:23.340838 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 04:48:23 crc kubenswrapper[4775]: I0321 04:48:23.605046 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.020623 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.020796 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.021928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.021965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.021980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.030050 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.500851 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.502649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.502701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.502716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:24 crc kubenswrapper[4775]: I0321 04:48:24.606522 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:25 crc kubenswrapper[4775]: I0321 04:48:25.604663 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:26 crc kubenswrapper[4775]: I0321 04:48:26.606049 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:27 crc kubenswrapper[4775]: I0321 04:48:27.604100 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:27 crc kubenswrapper[4775]: E0321 04:48:27.734928 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:48:28 crc kubenswrapper[4775]: I0321 04:48:28.603948 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:29 crc kubenswrapper[4775]: I0321 04:48:29.604368 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:30 crc kubenswrapper[4775]: E0321 04:48:30.337587 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.341728 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.343087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.343131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.343144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.343173 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:48:30 crc kubenswrapper[4775]: E0321 04:48:30.347421 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 21 04:48:30 crc kubenswrapper[4775]: I0321 04:48:30.607360 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:31 crc kubenswrapper[4775]: I0321 04:48:31.604555 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.113865 4775 csr.go:261] certificate signing request csr-mkg6s is approved, waiting to be issued
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.122827 4775 csr.go:257] certificate signing request csr-mkg6s is issued
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.150515 4775 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.461312 4775 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.660611 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.661577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.661608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.661616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:32 crc kubenswrapper[4775]: I0321 04:48:32.662216 4775 scope.go:117] "RemoveContainer" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.123990 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-28 22:05:43.809376382 +0000 UTC
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.124358 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6065h17m10.685022258s for next certificate rotation
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.521226 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.521934 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.523657 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4" exitCode=255
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.523705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4"}
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.523743 4775 scope.go:117] "RemoveContainer" containerID="9756a73583a81b5cd100849e9bf3b7b2cbe36f2d69afc978be7f77672bb9a36b"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.523945 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.525663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.525687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.525697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:33 crc kubenswrapper[4775]: I0321 04:48:33.526244 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4"
Mar 21 04:48:33 crc kubenswrapper[4775]: E0321 04:48:33.526446 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:34 crc kubenswrapper[4775]: I0321 04:48:34.527601 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.347866 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.349096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.349162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.349175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.349296 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.356491 4775 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.356772 4775 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.356788 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.359812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.359854 4775
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.359866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.359880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.359891 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:37Z","lastTransitionTime":"2026-03-21T04:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.371436 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.377665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.377714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.377729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.377749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.377763 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:37Z","lastTransitionTime":"2026-03-21T04:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.386748 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.392929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.392965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.392976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.392992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.393003 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:37Z","lastTransitionTime":"2026-03-21T04:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.402375 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.412379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.412439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.412452 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.412471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:48:37 crc kubenswrapper[4775]: I0321 04:48:37.412487 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:37Z","lastTransitionTime":"2026-03-21T04:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.422003 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.422136 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.422160 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.522655 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.623268 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.723920 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.736389 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.824988 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:37 crc kubenswrapper[4775]: E0321 04:48:37.925411 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.026430 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.126611 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.227565 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.328895 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.429860 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.530761 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.631470 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.732188 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.832764 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:38 crc kubenswrapper[4775]: E0321 04:48:38.933500 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.033873 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.134248 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.235338 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.335715 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.435849 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.536404 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.636994 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.702811 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.702962 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.703948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.703987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.704002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:39 crc kubenswrapper[4775]: I0321 04:48:39.704550 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.704725 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.737745 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.838095 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:39 crc kubenswrapper[4775]: E0321 04:48:39.938924 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.039263 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: I0321 04:48:40.069040 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.139541 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.239664 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.340525 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.441135 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.541226 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.641900 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.742538 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.843650 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: E0321 04:48:40.944060 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:40 crc kubenswrapper[4775]: I0321 04:48:40.981169 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.044785 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.145242 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.246345 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.346849 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.447865 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.548502 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.648797 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.749170 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.849424 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:41 crc kubenswrapper[4775]: E0321 04:48:41.950548 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.051185 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.151811 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.252663 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.353595 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.453895 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.554570 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.654841 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.755147 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.855597 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:42 crc kubenswrapper[4775]: E0321 04:48:42.955974 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.056152 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.133581 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.134110 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.135444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.135481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.135494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:48:43 crc kubenswrapper[4775]: I0321 04:48:43.136167 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4"
Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.136383 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.156625 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.257410 4775 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.357829 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.458694 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.559102 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.660331 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.761228 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.861445 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:43 crc kubenswrapper[4775]: E0321 04:48:43.962225 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.063432 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.164523 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.265291 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.366034 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.466966 4775 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.568178 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.668539 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.769379 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.870196 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:44 crc kubenswrapper[4775]: E0321 04:48:44.971027 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.072202 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.172586 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.273416 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.374417 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.475585 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.576060 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc 
kubenswrapper[4775]: I0321 04:48:45.660778 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:48:45 crc kubenswrapper[4775]: I0321 04:48:45.662010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:45 crc kubenswrapper[4775]: I0321 04:48:45.662057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:45 crc kubenswrapper[4775]: I0321 04:48:45.662082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.676610 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.776721 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.877796 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:45 crc kubenswrapper[4775]: E0321 04:48:45.978930 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.079611 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.180409 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.281287 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.382284 4775 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.483367 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.584352 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.685461 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.785888 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.887019 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:46 crc kubenswrapper[4775]: E0321 04:48:46.987401 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.087595 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.188643 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.288928 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.389789 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.490665 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.590945 4775 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.691599 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.737315 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.743936 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.748415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.748454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.748463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.748476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.748485 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:47Z","lastTransitionTime":"2026-03-21T04:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.759554 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.763774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.763813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.763827 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.763844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.763855 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:47Z","lastTransitionTime":"2026-03-21T04:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.773735 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.791737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.791783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.791791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.791806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.791815 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:47Z","lastTransitionTime":"2026-03-21T04:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.800679 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.804204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.804342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.804433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.804566 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:47 crc kubenswrapper[4775]: I0321 04:48:47.804659 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:47Z","lastTransitionTime":"2026-03-21T04:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.814398 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.814513 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.814535 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:47 crc kubenswrapper[4775]: E0321 04:48:47.915204 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.016411 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.117154 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.218318 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.318783 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.419828 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.520593 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.621559 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.722149 4775 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.822762 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:48 crc kubenswrapper[4775]: E0321 04:48:48.923700 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.024053 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.124744 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.225331 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.326247 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.427232 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.528045 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.629416 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.730217 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc kubenswrapper[4775]: E0321 04:48:49.830950 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:49 crc 
kubenswrapper[4775]: E0321 04:48:49.931608 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.031894 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.132440 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.232807 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.333583 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.434610 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.535427 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.635980 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.737106 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.837566 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:50 crc kubenswrapper[4775]: E0321 04:48:50.938578 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.039752 4775 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.140402 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.240744 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.341026 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.441871 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: E0321 04:48:51.542502 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.571107 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.645777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.645829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.645846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.645864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.645875 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:51Z","lastTransitionTime":"2026-03-21T04:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.748182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.748221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.748231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.748245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.748254 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:51Z","lastTransitionTime":"2026-03-21T04:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.850569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.850611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.850622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.850638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.850649 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:51Z","lastTransitionTime":"2026-03-21T04:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.953665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.953696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.953704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.953717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:51 crc kubenswrapper[4775]: I0321 04:48:51.953725 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:51Z","lastTransitionTime":"2026-03-21T04:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.056236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.056273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.056283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.056297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.056305 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.158448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.158476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.158485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.158498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.158506 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.261102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.261175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.261192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.261210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.261237 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.363575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.363604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.363617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.363632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.363641 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.466207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.466257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.466273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.466290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.466300 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.499068 4775 apiserver.go:52] "Watching apiserver" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.505059 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.505355 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.505665 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.505772 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.505908 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.505943 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.506058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.506104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.506185 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.506259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.506385 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.508555 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.508926 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.509145 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510255 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510399 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510458 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510399 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510814 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.510833 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.533991 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.549673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.562030 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.568942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.568982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.568992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.569008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.569021 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.575840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.590139 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.603438 4775 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.606605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.624539 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.639625 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.650606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.661135 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.671314 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690605 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690665 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690705 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690742 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690803 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690824 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690848 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690889 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690908 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690940 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.690994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691051 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691069 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691088 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691129 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691150 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691207 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691232 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691251 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.694030 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.696464 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691986 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692108 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.691180 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692341 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692359 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692464 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692346 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692813 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692829 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.696506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.696983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697020 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:48:52 crc 
kubenswrapper[4775]: I0321 04:48:52.697047 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697066 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697090 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697160 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697230 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697276 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697399 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697463 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697489 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.692930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693093 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697643 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693156 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693188 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693246 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693593 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693657 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693959 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693982 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.693989 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.694282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.694446 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.695064 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.698146 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.698205 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699516 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.697883 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699680 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699710 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699737 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699745 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699768 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699805 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699769 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699894 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699930 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699962 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700058 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700142 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700172 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700199 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 
04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.699961 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700253 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700422 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700535 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700632 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.700328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701216 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701234 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701252 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701270 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701287 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701321 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701356 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701411 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.701429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701445 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701462 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701479 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701513 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701530 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701547 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.701629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701657 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701700 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701725 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") 
pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701793 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701809 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701825 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701859 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701913 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.701999 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702048 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702063 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702369 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702388 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702415 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702441 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702601 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702902 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.702468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703326 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703379 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703452 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703502 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 
04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703549 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") 
pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703817 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703835 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703851 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703872 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.703892 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703910 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703926 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703964 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703987 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704011 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704030 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704079 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704099 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:48:52 crc 
kubenswrapper[4775]: I0321 04:48:52.704325 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703557 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703620 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703870 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.703968 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.704306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.704322 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:48:53.204296336 +0000 UTC m=+86.180759960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705391 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705439 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705500 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705523 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705551 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705578 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705597 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705772 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706069 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706245 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706291 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706372 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706407 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.707945 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708091 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708185 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.708371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708748 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708775 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 
04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708880 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708966 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709025 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709055 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709083 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710328 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710357 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710374 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710389 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710404 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710417 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710432 4775 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710461 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710477 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710491 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710504 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710518 4775 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710531 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710543 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710556 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710571 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710585 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710599 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713176 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713386 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713404 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713419 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713619 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713653 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713673 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713691 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713706 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713728 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713742 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713757 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713772 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713784 4775 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713800 4775 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713814 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713825 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713837 4775 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713850 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713864 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713876 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713889 4775 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713903 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713914 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713927 4775 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713940 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713952 4775 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713977 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713990 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714005 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714017 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714033 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node 
\"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714046 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714058 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714068 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714078 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714091 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714106 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714135 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714146 
4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714159 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711786 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714793 4775 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706593 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705094 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.706995 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707378 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707548 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707745 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707884 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.707969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708365 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.708614 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709704 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709779 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709785 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710080 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710186 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.709801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710549 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.710858 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711020 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711008 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711490 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711634 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.705090 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.711848 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712460 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712662 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712747 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712766 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.712959 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713176 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713248 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713297 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713452 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713520 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713611 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.713906 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714193 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714221 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714248 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.714333 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.715407 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.715460 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714448 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714685 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.714866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.715841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.716067 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.716094 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.716530 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.716789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.716833 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.716541 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:53.216518781 +0000 UTC m=+86.192982405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.717925 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:53.21787165 +0000 UTC m=+86.194335314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.718020 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717128 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717443 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.717553 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.718650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.719000 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.719260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.719289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.722841 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.724850 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.726748 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.726777 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.726793 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.726859 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:53.226838103 +0000 UTC m=+86.203301917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.729724 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.730584 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.730986 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.731406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.731476 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.731515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.731633 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.731656 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.731724 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:53.23170075 +0000 UTC m=+86.208164554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.732082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.732181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.732952 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.733189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.734354 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.735696 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.738245 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.739584 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.739605 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.740536 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741044 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741070 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741104 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741050 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741474 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741589 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741693 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.741824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.742225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.743370 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.743544 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.743759 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.743979 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.744903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.744898 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745156 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745315 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745681 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.745965 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746203 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746270 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746741 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746922 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.746972 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.748190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.748600 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.749071 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.749280 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.761353 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.768688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.769211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.770552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.773290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.773317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.773343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.773362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.773378 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814846 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814861 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814924 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814938 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814947 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814956 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814965 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814976 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.814996 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815008 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815020 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815030 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815041 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815049 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815057 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815065 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815073 4775 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815083 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815091 4775 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815100 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815108 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815146 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815160 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815169 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on 
node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815178 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815186 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815194 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815202 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815210 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815218 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815228 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815236 
4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815243 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815251 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815261 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815269 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815277 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815285 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815293 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815301 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815308 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815316 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815324 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815332 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815339 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815347 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on 
node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815355 4775 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815362 4775 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815370 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815377 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815385 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815393 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815401 4775 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 
04:48:52.815409 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815418 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815425 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815433 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815441 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815450 4775 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815459 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815471 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815483 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815492 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815501 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815512 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815522 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815530 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815538 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815546 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815556 4775 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815565 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815575 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815584 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815592 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815601 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815610 4775 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815619 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815628 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815637 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815680 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815688 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815697 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815706 4775 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815715 4775 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815723 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815731 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815740 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815749 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815758 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815769 4775 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815787 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815800 4775 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815810 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815822 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815832 4775 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815843 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815855 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815865 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815877 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815888 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815897 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815905 4775 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815913 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815922 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815930 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815939 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815947 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815955 4775 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815963 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815972 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815982 4775 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.815992 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816001 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816011 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816023 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816032 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816042 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816054 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816065 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816074 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816082 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816090 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816097 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816105 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816133 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816142 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816150 4775 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816159 4775 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816166 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816173 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816181 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816189 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816196 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816204 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.816213 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.819412 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.825239 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.834617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:48:52 crc kubenswrapper[4775]: W0321 04:48:52.837030 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-b1c7b006c84367f3a4790ee0ff97d47b510e4b6095d1cb4f1ca0271198041409 WatchSource:0}: Error finding container b1c7b006c84367f3a4790ee0ff97d47b510e4b6095d1cb4f1ca0271198041409: Status 404 returned error can't find the container with id b1c7b006c84367f3a4790ee0ff97d47b510e4b6095d1cb4f1ca0271198041409
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.842273 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:48:52 crc kubenswrapper[4775]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 21 04:48:52 crc kubenswrapper[4775]: if [[ -f "/env/_master" ]]; then
Mar 21 04:48:52 crc kubenswrapper[4775]: set -o allexport
Mar 21 04:48:52 crc kubenswrapper[4775]: source "/env/_master"
Mar 21 04:48:52 crc kubenswrapper[4775]: set +o allexport
Mar 21 04:48:52 crc kubenswrapper[4775]: fi
Mar 21 04:48:52 crc kubenswrapper[4775]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 21 04:48:52 crc kubenswrapper[4775]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 21 04:48:52 crc kubenswrapper[4775]: ho_enable="--enable-hybrid-overlay"
Mar 21 04:48:52 crc kubenswrapper[4775]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 21 04:48:52 crc kubenswrapper[4775]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 21 04:48:52 crc kubenswrapper[4775]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 21 04:48:52 crc kubenswrapper[4775]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 21 04:48:52 crc kubenswrapper[4775]: --webhook-cert-dir="/etc/webhook-cert" \
Mar 21 04:48:52 crc kubenswrapper[4775]: --webhook-host=127.0.0.1 \
Mar 21 04:48:52 crc kubenswrapper[4775]: --webhook-port=9743 \
Mar 21 04:48:52 crc kubenswrapper[4775]: ${ho_enable} \
Mar 21 04:48:52 crc kubenswrapper[4775]: --enable-interconnect \
Mar 21 04:48:52 crc kubenswrapper[4775]: --disable-approver \
Mar 21 04:48:52 crc kubenswrapper[4775]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 21 04:48:52 crc kubenswrapper[4775]: --wait-for-kubernetes-api=200s \
Mar 21 04:48:52 crc kubenswrapper[4775]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 21 04:48:52 crc kubenswrapper[4775]: --loglevel="${LOGLEVEL}"
Mar 21 04:48:52 crc kubenswrapper[4775]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 21 04:48:52 crc kubenswrapper[4775]: > logger="UnhandledError"
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.842383 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:48:52 crc kubenswrapper[4775]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 21 04:48:52 crc kubenswrapper[4775]: set -o allexport
Mar 21 04:48:52 crc kubenswrapper[4775]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 21 04:48:52 crc kubenswrapper[4775]: source /etc/kubernetes/apiserver-url.env
Mar 21 04:48:52 crc kubenswrapper[4775]: else
Mar 21 04:48:52 crc kubenswrapper[4775]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 21 04:48:52 crc kubenswrapper[4775]: exit 1
Mar 21 04:48:52 crc kubenswrapper[4775]: fi
Mar 21 04:48:52 crc kubenswrapper[4775]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 21 04:48:52 crc kubenswrapper[4775]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 21 04:48:52 crc kubenswrapper[4775]: > logger="UnhandledError"
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.843497 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.844575 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:48:52 crc kubenswrapper[4775]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 21 04:48:52 crc kubenswrapper[4775]: if [[ -f "/env/_master" ]]; then
Mar 21 04:48:52 crc kubenswrapper[4775]: set -o allexport
Mar 21 04:48:52 crc kubenswrapper[4775]: source "/env/_master"
Mar 21 04:48:52 crc kubenswrapper[4775]: set +o allexport
Mar 21 04:48:52 crc kubenswrapper[4775]: fi
Mar 21 04:48:52 crc kubenswrapper[4775]: 
Mar 21 04:48:52 crc kubenswrapper[4775]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 21 04:48:52 crc kubenswrapper[4775]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 21 04:48:52 crc kubenswrapper[4775]: --disable-webhook \
Mar 21 04:48:52 crc kubenswrapper[4775]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 21 04:48:52 crc kubenswrapper[4775]: --loglevel="${LOGLEVEL}"
Mar 21 04:48:52 crc kubenswrapper[4775]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 21 04:48:52 crc kubenswrapper[4775]: > logger="UnhandledError"
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.845792 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 21 04:48:52 crc kubenswrapper[4775]: W0321 04:48:52.850251 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-14d8bf0b6d75a02942a6a502beb9579e7f6b9f1c6b1eb223e7d7c6b60d3a4fd4 WatchSource:0}: Error finding container 14d8bf0b6d75a02942a6a502beb9579e7f6b9f1c6b1eb223e7d7c6b60d3a4fd4: Status 404 returned error can't find the container with id 14d8bf0b6d75a02942a6a502beb9579e7f6b9f1c6b1eb223e7d7c6b60d3a4fd4
Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.852940 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:48:52 crc kubenswrapper[4775]: E0321 04:48:52.854111 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.875352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.875399 
4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.875408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.875439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.875448 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.977665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.977712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.977740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.977761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:52 crc kubenswrapper[4775]: I0321 04:48:52.977772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:52Z","lastTransitionTime":"2026-03-21T04:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.080715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.080765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.080777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.080795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.080806 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.182747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.182793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.182806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.182824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.182837 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.218359 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.218519 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:48:54.218490122 +0000 UTC m=+87.194953746 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.218578 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.218614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.218721 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.218789 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:48:54.21876851 +0000 UTC m=+87.195232174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.218722 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.218823 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:54.218817331 +0000 UTC m=+87.195280955 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.285583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.285632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.285644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.285660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.285673 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.319473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.319516 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319616 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319636 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319646 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319694 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:54.319681409 +0000 UTC m=+87.296145033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319708 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319744 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319757 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.319821 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:54.319798252 +0000 UTC m=+87.296261916 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.387870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.387915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.387957 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.387974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.387986 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.490239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.490297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.490306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.490321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.490331 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.576717 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"af3c6cc594d3df6f8481abbdfcef717c4a3b5771cfa7847295f4044b5982be48"} Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.577972 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:48:53 crc kubenswrapper[4775]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:48:53 crc kubenswrapper[4775]: set -o allexport Mar 21 04:48:53 crc kubenswrapper[4775]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:48:53 crc kubenswrapper[4775]: source /etc/kubernetes/apiserver-url.env Mar 21 04:48:53 crc kubenswrapper[4775]: else Mar 21 04:48:53 crc kubenswrapper[4775]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:48:53 crc kubenswrapper[4775]: exit 1 Mar 21 04:48:53 crc kubenswrapper[4775]: fi Mar 21 04:48:53 crc kubenswrapper[4775]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:48:53 crc kubenswrapper[4775]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:48:53 crc kubenswrapper[4775]: > logger="UnhandledError" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.578919 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b1c7b006c84367f3a4790ee0ff97d47b510e4b6095d1cb4f1ca0271198041409"} Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.579209 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.580212 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:48:53 crc kubenswrapper[4775]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:48:53 crc kubenswrapper[4775]: if [[ -f "/env/_master" ]]; then Mar 21 04:48:53 crc kubenswrapper[4775]: set -o allexport Mar 21 04:48:53 crc kubenswrapper[4775]: source "/env/_master" Mar 21 04:48:53 crc kubenswrapper[4775]: set +o allexport Mar 21 04:48:53 crc kubenswrapper[4775]: fi Mar 21 04:48:53 crc kubenswrapper[4775]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 21 04:48:53 crc kubenswrapper[4775]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:48:53 crc kubenswrapper[4775]: ho_enable="--enable-hybrid-overlay" Mar 21 04:48:53 crc kubenswrapper[4775]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:48:53 crc kubenswrapper[4775]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:48:53 crc kubenswrapper[4775]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:48:53 crc kubenswrapper[4775]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:48:53 crc kubenswrapper[4775]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:48:53 crc kubenswrapper[4775]: --webhook-host=127.0.0.1 \ Mar 21 04:48:53 crc kubenswrapper[4775]: --webhook-port=9743 \ Mar 21 04:48:53 crc kubenswrapper[4775]: ${ho_enable} \ Mar 21 04:48:53 crc kubenswrapper[4775]: --enable-interconnect \ Mar 21 04:48:53 crc kubenswrapper[4775]: 
--disable-approver \ Mar 21 04:48:53 crc kubenswrapper[4775]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:48:53 crc kubenswrapper[4775]: --wait-for-kubernetes-api=200s \ Mar 21 04:48:53 crc kubenswrapper[4775]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:48:53 crc kubenswrapper[4775]: --loglevel="${LOGLEVEL}" Mar 21 04:48:53 crc kubenswrapper[4775]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:48:53 crc kubenswrapper[4775]: > logger="UnhandledError" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.580430 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"14d8bf0b6d75a02942a6a502beb9579e7f6b9f1c6b1eb223e7d7c6b60d3a4fd4"} Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.581438 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.582585 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.582824 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:48:53 crc kubenswrapper[4775]: container 
&Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:48:53 crc kubenswrapper[4775]: if [[ -f "/env/_master" ]]; then Mar 21 04:48:53 crc kubenswrapper[4775]: set -o allexport Mar 21 04:48:53 crc kubenswrapper[4775]: source "/env/_master" Mar 21 04:48:53 crc kubenswrapper[4775]: set +o allexport Mar 21 04:48:53 crc kubenswrapper[4775]: fi Mar 21 04:48:53 crc kubenswrapper[4775]: Mar 21 04:48:53 crc kubenswrapper[4775]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:48:53 crc kubenswrapper[4775]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:48:53 crc kubenswrapper[4775]: --disable-webhook \ Mar 21 04:48:53 crc kubenswrapper[4775]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:48:53 crc kubenswrapper[4775]: --loglevel="${LOGLEVEL}" Mar 21 04:48:53 crc kubenswrapper[4775]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:48:53 crc kubenswrapper[4775]: > logger="UnhandledError" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.583986 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.587383 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.592580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.592630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.592645 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.592662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.592673 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.598339 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.609054 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.620316 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.632365 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.642970 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.655857 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.660530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:53 crc kubenswrapper[4775]: E0321 04:48:53.660701 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.665416 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.666140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.667710 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.670320 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.671818 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.672527 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.677383 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.678052 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.679473 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.680166 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.680568 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.680787 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.682088 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.682770 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.683886 4775 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.684536 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.685752 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.686736 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.687411 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.688742 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.689663 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.690093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.691526 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.692318 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.692833 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.695479 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 21 04:48:53 crc 
kubenswrapper[4775]: I0321 04:48:53.695948 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697441 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.697617 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.698537 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.699576 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.700306 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.701419 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.701933 4775 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.702051 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.702535 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.704511 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.705195 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.705645 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.707347 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.708465 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.709025 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.710543 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.711281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.711825 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.712780 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.714046 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.715279 4775 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.715998 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.717058 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.717722 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.718767 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.719634 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.720609 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.721096 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.721779 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.722213 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.722939 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.723544 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.724443 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.724929 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 21 
04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.799224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.799285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.799299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.799317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.799658 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.901429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.901506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.901547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.901562 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:53 crc kubenswrapper[4775]: I0321 04:48:53.901571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:53Z","lastTransitionTime":"2026-03-21T04:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.003588 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.003642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.003653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.003670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.003682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.109488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.109540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.109553 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.109573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.109585 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.212311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.212385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.212398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.212412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.212422 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.228472 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.228553 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.228577 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.228678 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.228733 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:56.228717581 +0000 UTC m=+89.205181205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.229102 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:48:56.229090502 +0000 UTC m=+89.205554136 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.229201 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.229238 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:56.229227515 +0000 UTC m=+89.205691139 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.315075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.315150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.315162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.315180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.315192 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.329575 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.329620 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329725 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329739 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329782 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329818 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 
04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329855 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329869 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329830 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:56.329817185 +0000 UTC m=+89.306280809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.329941 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:48:56.329922488 +0000 UTC m=+89.306386182 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417430 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.417950 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.519523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.519564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.519574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.519609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.519622 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.622023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.622183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.622199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.622222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.622236 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.661623 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.661764 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.661973 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:48:54 crc kubenswrapper[4775]: E0321 04:48:54.661783 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.724768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.724809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.724820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.724835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.724847 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.828282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.828318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.828327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.828340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.828349 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.931153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.931196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.931209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.931226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:54 crc kubenswrapper[4775]: I0321 04:48:54.931238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:54Z","lastTransitionTime":"2026-03-21T04:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.033027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.033078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.033090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.033142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.033162 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.135593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.135630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.135642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.135658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.135669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.238249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.238299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.238311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.238329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.238341 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.340930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.340961 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.340969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.340982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.340991 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.442684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.442713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.442721 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.442733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.442743 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.544925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.545018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.545031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.545073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.545083 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.647784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.647848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.647863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.647901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.647915 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.661480 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:55 crc kubenswrapper[4775]: E0321 04:48:55.661716 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.750443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.750499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.750512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.750528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.750539 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.853267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.853302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.853311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.853324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.853332 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.955827 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.955862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.955872 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.955885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:55 crc kubenswrapper[4775]: I0321 04:48:55.955894 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:55Z","lastTransitionTime":"2026-03-21T04:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.057634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.057674 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.057685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.057700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.057710 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.159595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.159636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.159645 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.159658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.159667 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.244779 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.244861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.244885 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.244963 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.245008 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:00.24499518 +0000 UTC m=+93.221458804 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.245411 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.245506 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:49:00.245472393 +0000 UTC m=+93.221936017 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.245535 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:00.245528615 +0000 UTC m=+93.221992239 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.262080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.262311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.262385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.262464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.262532 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.346109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.346186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346314 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346336 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346347 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346402 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:00.346384882 +0000 UTC m=+93.322848506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346314 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346767 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346839 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.346930 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:00.346919267 +0000 UTC m=+93.323382891 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.364882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.364917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.364929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.364946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.364967 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.467162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.467215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.467229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.467245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.467256 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.569064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.569102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.569139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.569155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.569165 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.660823 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.660867 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.660977 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:48:56 crc kubenswrapper[4775]: E0321 04:48:56.661362 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.671495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.671522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.671530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.671544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.671553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.773619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.773670 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.773683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.773708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.773725 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.876050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.876086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.876095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.876109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.876131 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.981311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.981361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.981479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.982053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:56 crc kubenswrapper[4775]: I0321 04:48:56.982138 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:56Z","lastTransitionTime":"2026-03-21T04:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.084691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.084729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.084737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.084750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.084759 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.187534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.187573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.187582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.187595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.187605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.289792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.289868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.289882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.289898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.289908 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.392340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.392955 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.392970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.392986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.392996 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.494740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.494783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.494792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.494807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.494816 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.596361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.596400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.596411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.596430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.596442 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.660515 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.660667 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.683349 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
1T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.692911 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.698474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.698641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.698744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.698830 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.698929 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.703090 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.712645 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.721437 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.731583 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.742358 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.801149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.801199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 
04:48:57.801214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.801236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.801251 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.882454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.882760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.882845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.882935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.883022 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.892829 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.896695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.896838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.896930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.897031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.897151 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.907399 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.911359 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.911404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.911413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.911428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.911437 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.921532 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.926044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.926094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.926104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.926164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.926176 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.936495 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.939700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.939822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.939920 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.940044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.940182 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.950910 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:48:57 crc kubenswrapper[4775]: E0321 04:48:57.951349 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.952875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.952907 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.952919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.952934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:57 crc kubenswrapper[4775]: I0321 04:48:57.952947 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:57Z","lastTransitionTime":"2026-03-21T04:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.055023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.055276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.055388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.055484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.055571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.158227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.158517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.158589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.158655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.158713 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.261944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.261984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.261996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.262010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.262021 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.365239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.365282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.365292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.365309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.365320 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.467824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.468138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.468215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.468316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.468401 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.570795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.571057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.571194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.571321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.571413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.660938 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:48:58 crc kubenswrapper[4775]: E0321 04:48:58.661318 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.660956 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:48:58 crc kubenswrapper[4775]: E0321 04:48:58.661666 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.670472 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.670941 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4" Mar 21 04:48:58 crc kubenswrapper[4775]: E0321 04:48:58.671300 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.674012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.674042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc 
kubenswrapper[4775]: I0321 04:48:58.674053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.674076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.674088 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.776102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.776206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.776222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.776237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.776273 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.879634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.879888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.879958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.880040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.880151 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.982535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.982843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.983023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.983234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:58 crc kubenswrapper[4775]: I0321 04:48:58.983404 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:58Z","lastTransitionTime":"2026-03-21T04:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.085605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.085676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.085689 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.085705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.085716 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.188177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.188219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.188232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.188249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.188261 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.290294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.290336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.290346 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.290384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.290400 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.392147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.392261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.392279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.392300 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.392311 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.494312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.494353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.494364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.494379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.494389 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.594821 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4" Mar 21 04:48:59 crc kubenswrapper[4775]: E0321 04:48:59.595008 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.596423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.596458 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.596468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.596482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.596491 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.660731 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:48:59 crc kubenswrapper[4775]: E0321 04:48:59.660845 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.699108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.699167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.699175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.699189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.699198 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.801240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.801273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.801283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.801295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.801304 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.905637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.905691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.905701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.905716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:48:59 crc kubenswrapper[4775]: I0321 04:48:59.905728 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:48:59Z","lastTransitionTime":"2026-03-21T04:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.008434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.008485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.008498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.008516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.008529 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.110968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.111106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.111154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.111173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.111185 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.214179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.214214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.214224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.214240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.214252 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.278619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.278703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.278734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.278849 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.278870 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:49:08.278838594 +0000 UTC m=+101.255302218 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.278907 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:08.278897886 +0000 UTC m=+101.255361620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.278988 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.279071 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:08.27904949 +0000 UTC m=+101.255513194 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.316896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.316925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.316933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.316945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.316954 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.379930 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.379976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380093 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380139 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380152 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380195 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:08.380180415 +0000 UTC m=+101.356644029 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380512 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380532 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380543 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.380570 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:08.380563396 +0000 UTC m=+101.357027020 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.425344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.425407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.425420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.425440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.425455 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.528358 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.528392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.528401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.528414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.528424 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.631677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.631723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.631734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.631753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.631764 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.661282 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.661361 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.661402 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:00 crc kubenswrapper[4775]: E0321 04:49:00.661494 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.734348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.734386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.734397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.734410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.734421 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.836590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.836661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.836675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.836694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.836708 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.938448 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.938492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.938502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.938517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:00 crc kubenswrapper[4775]: I0321 04:49:00.938526 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:00Z","lastTransitionTime":"2026-03-21T04:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.041938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.041978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.041987 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.042001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.042009 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.145653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.145704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.145719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.145736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.145749 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.247391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.247421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.247431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.247443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.247452 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.349752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.349818 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.349830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.349847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.349858 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.452659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.452706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.452718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.452736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.452754 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.555172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.555221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.555237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.555256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.555268 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.657872 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.657917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.657928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.657944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.657955 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.661302 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:01 crc kubenswrapper[4775]: E0321 04:49:01.661462 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.759619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.759660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.759678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.759697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.759709 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.834066 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-khh7x"] Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.834406 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.837191 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.837196 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.838596 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.845043 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.856672 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381
f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.861284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.861322 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.861334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.861350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.861362 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.867356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.877665 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.887087 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.896715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.901136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwtgh\" (UniqueName: \"kubernetes.io/projected/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-kube-api-access-bwtgh\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.901180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-hosts-file\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.911218 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.921109 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.926647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.964915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.964964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.964980 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.964997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:01 crc kubenswrapper[4775]: I0321 04:49:01.965010 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:01Z","lastTransitionTime":"2026-03-21T04:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.001953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-hosts-file\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.002029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwtgh\" (UniqueName: \"kubernetes.io/projected/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-kube-api-access-bwtgh\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.002179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-hosts-file\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.020576 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bwtgh\" (UniqueName: \"kubernetes.io/projected/70cca5ae-9de4-4933-a6f8-4a23ab711bbf-kube-api-access-bwtgh\") pod \"node-resolver-khh7x\" (UID: \"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\") " pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.067278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.067316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.067325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.067338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.067346 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.149246 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qc7hn"] Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.150621 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-khh7x" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.150782 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.151384 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-556rg"] Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.151639 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kldzh"] Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.151918 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.152141 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.155257 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.155528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.155854 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.155988 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.156335 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.156584 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.156640 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.156861 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.157013 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.157484 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.157672 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.157796 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.170975 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.172446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.172477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.172490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.172505 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.172516 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.178375 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.192042 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.199887 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204720 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204771 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-conf-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: 
I0321 04:49:02.204793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffcf487-ef41-4395-81eb-e5e6358f4a32-proxy-tls\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-system-cni-dir\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204833 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-cni-binary-copy\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204870 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cffcf487-ef41-4395-81eb-e5e6358f4a32-rootfs\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204889 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-socket-dir-parent\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " 
pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204909 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft22\" (UniqueName: \"kubernetes.io/projected/e77ec218-42da-4f07-b214-184c4f3b20f3-kube-api-access-vft22\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204929 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-etc-kubernetes\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204965 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-binary-copy\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.204985 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-hostroot\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205004 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pxr5\" (UniqueName: \"kubernetes.io/projected/cffcf487-ef41-4395-81eb-e5e6358f4a32-kube-api-access-9pxr5\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205024 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-multus-certs\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-os-release\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205080 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-netns\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205100 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-multus\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " 
pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-kubelet\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-k8s-cni-cncf-io\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205183 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-bin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-cnibin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-daemon-config\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 
04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205247 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqnl6\" (UniqueName: \"kubernetes.io/projected/957cde70-ca20-438a-a4bf-42481dddb2db-kube-api-access-sqnl6\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205266 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-system-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205287 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffcf487-ef41-4395-81eb-e5e6358f4a32-mcd-auth-proxy-config\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-cnibin\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205330 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-os-release\") pod \"multus-additional-cni-plugins-kldzh\" (UID: 
\"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.205350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.208996 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.217595 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.227568 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.235150 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.246814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.259688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.274781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.274816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.274827 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.274843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.274852 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.284446 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.294551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.305827 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.306428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cffcf487-ef41-4395-81eb-e5e6358f4a32-rootfs\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.306521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-socket-dir-parent\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307236 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vft22\" (UniqueName: 
\"kubernetes.io/projected/e77ec218-42da-4f07-b214-184c4f3b20f3-kube-api-access-vft22\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.306521 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cffcf487-ef41-4395-81eb-e5e6358f4a32-rootfs\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.306669 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-socket-dir-parent\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307517 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-etc-kubernetes\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-binary-copy\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307606 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-hostroot\") pod \"multus-556rg\" (UID: 
\"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307633 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-etc-kubernetes\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.307921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-hostroot\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.308369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pxr5\" (UniqueName: \"kubernetes.io/projected/cffcf487-ef41-4395-81eb-e5e6358f4a32-kube-api-access-9pxr5\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.308523 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-multus-certs\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-multus-certs\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc 
kubenswrapper[4775]: I0321 04:49:02.309384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-os-release\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-netns\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309455 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-multus\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309516 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-kubelet\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309539 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-k8s-cni-cncf-io\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-bin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-cnibin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-daemon-config\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-kubelet\") pod \"multus-556rg\" (UID: 
\"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqnl6\" (UniqueName: \"kubernetes.io/projected/957cde70-ca20-438a-a4bf-42481dddb2db-kube-api-access-sqnl6\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-k8s-cni-cncf-io\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309682 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-system-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-run-netns\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffcf487-ef41-4395-81eb-e5e6358f4a32-mcd-auth-proxy-config\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-cnibin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309781 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-cnibin\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309813 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-cnibin\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-os-release\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310073 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-os-release\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 
crc kubenswrapper[4775]: I0321 04:49:02.309695 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-os-release\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309866 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-system-cni-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.309989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310289 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310312 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-conf-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310352 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffcf487-ef41-4395-81eb-e5e6358f4a32-proxy-tls\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-system-cni-dir\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-cni-binary-copy\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310575 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cffcf487-ef41-4395-81eb-e5e6358f4a32-mcd-auth-proxy-config\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-daemon-config\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310689 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-multus-conf-dir\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.310983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-system-cni-dir\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.311010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e77ec218-42da-4f07-b214-184c4f3b20f3-cni-binary-copy\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.311301 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-bin\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.311345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e77ec218-42da-4f07-b214-184c4f3b20f3-host-var-lib-cni-multus\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.312423 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/957cde70-ca20-438a-a4bf-42481dddb2db-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.318320 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-binary-copy\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.318726 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cffcf487-ef41-4395-81eb-e5e6358f4a32-proxy-tls\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.322291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/957cde70-ca20-438a-a4bf-42481dddb2db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.322389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.325240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft22\" (UniqueName: \"kubernetes.io/projected/e77ec218-42da-4f07-b214-184c4f3b20f3-kube-api-access-vft22\") pod \"multus-556rg\" (UID: \"e77ec218-42da-4f07-b214-184c4f3b20f3\") " pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.325427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pxr5\" (UniqueName: \"kubernetes.io/projected/cffcf487-ef41-4395-81eb-e5e6358f4a32-kube-api-access-9pxr5\") pod \"machine-config-daemon-qc7hn\" (UID: \"cffcf487-ef41-4395-81eb-e5e6358f4a32\") " pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.328981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqnl6\" (UniqueName: \"kubernetes.io/projected/957cde70-ca20-438a-a4bf-42481dddb2db-kube-api-access-sqnl6\") pod \"multus-additional-cni-plugins-kldzh\" (UID: \"957cde70-ca20-438a-a4bf-42481dddb2db\") " pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.334345 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.345617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.353836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.383548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.383592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.383604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.383620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.383630 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.389187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.400658 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.412190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.423753 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.433251 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.480907 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.488088 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-556rg" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.495732 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kldzh" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.497916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.497937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.497946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.497960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.497971 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.520440 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqtk"] Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.521225 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: W0321 04:49:02.523433 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957cde70_ca20_438a_a4bf_42481dddb2db.slice/crio-301da03471c41946481eae1a4d5aba941d0f6e54e94da5ce2e7db5cc0688799d WatchSource:0}: Error finding container 301da03471c41946481eae1a4d5aba941d0f6e54e94da5ce2e7db5cc0688799d: Status 404 returned error can't find the container with id 301da03471c41946481eae1a4d5aba941d0f6e54e94da5ce2e7db5cc0688799d Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.524035 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.524186 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.524330 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.524557 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.525107 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.525242 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.532017 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.537452 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae
08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251
b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.547185 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.566805 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.577952 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.586648 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.595977 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.603555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.605155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.605182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.605297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.605310 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.605516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"301da03471c41946481eae1a4d5aba941d0f6e54e94da5ce2e7db5cc0688799d"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.606501 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.607841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"7ef2a3b6220db4bc8aca03b1b3bd662eba9eefcf7c61a83b758d2c0438f4c19b"} Mar 21 04:49:02 crc kubenswrapper[4775]: W0321 04:49:02.616546 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode77ec218_42da_4f07_b214_184c4f3b20f3.slice/crio-a4b14fb323638d862a85c09bb4e48af7322efaab141d2c6f6e3a94ca0a74466d WatchSource:0}: Error finding container a4b14fb323638d862a85c09bb4e48af7322efaab141d2c6f6e3a94ca0a74466d: Status 404 returned error can't find the container with id a4b14fb323638d862a85c09bb4e48af7322efaab141d2c6f6e3a94ca0a74466d Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-khh7x" event={"ID":"70cca5ae-9de4-4933-a6f8-4a23ab711bbf","Type":"ContainerStarted","Data":"6e58e7b588aafb219cf072c2bd4bc4bbc5b8413034949e8002754ff3bb657735"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616899 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616919 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616961 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.616990 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617030 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617056 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617101 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617143 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617162 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: 
I0321 04:49:02.617299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7w6b\" (UniqueName: \"kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617330 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617363 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.617377 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc 
kubenswrapper[4775]: I0321 04:49:02.617434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.629216 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.646836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.657508 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.660358 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:02 crc kubenswrapper[4775]: E0321 04:49:02.660443 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.660708 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:02 crc kubenswrapper[4775]: E0321 04:49:02.660759 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.666501 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.675452 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.710735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.710776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.710788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.710805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.710818 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7w6b\" (UniqueName: \"kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718818 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718863 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc 
kubenswrapper[4775]: I0321 04:49:02.718880 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718895 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718933 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718964 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.718979 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719000 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719148 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.719210 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720395 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720438 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720469 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720531 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch\") pod \"ovnkube-node-mzqtk\" (UID: 
\"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720562 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720845 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720769 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.720979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.721005 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.721433 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.721658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.723875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.737743 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7w6b\" (UniqueName: \"kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b\") pod \"ovnkube-node-mzqtk\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.813715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.813757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.813768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc 
kubenswrapper[4775]: I0321 04:49:02.813782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.813792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.833344 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:02 crc kubenswrapper[4775]: W0321 04:49:02.851664 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69d31f5_deeb_4860_be96_ed5547831685.slice/crio-a053283b74d1089f3887c9fbc93ee3063db45e363837d9c9231a5cf45771e07b WatchSource:0}: Error finding container a053283b74d1089f3887c9fbc93ee3063db45e363837d9c9231a5cf45771e07b: Status 404 returned error can't find the container with id a053283b74d1089f3887c9fbc93ee3063db45e363837d9c9231a5cf45771e07b Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.916735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.916772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.916784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.916800 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:02 crc kubenswrapper[4775]: I0321 04:49:02.916812 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:02Z","lastTransitionTime":"2026-03-21T04:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.019604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.019661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.019672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.019691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.019703 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.122133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.122185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.122198 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.122214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.122225 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.225289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.225337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.225348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.225395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.225410 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.327441 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.327672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.327799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.328076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.328238 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.431081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.431362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.431490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.431621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.431689 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.534346 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.534709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.534817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.534906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.534984 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.620768 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" exitCode=0 Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.620839 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.620866 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"a053283b74d1089f3887c9fbc93ee3063db45e363837d9c9231a5cf45771e07b"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.622828 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerStarted","Data":"77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.622913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerStarted","Data":"a4b14fb323638d862a85c09bb4e48af7322efaab141d2c6f6e3a94ca0a74466d"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.624217 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.626478 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.626601 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.628579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-khh7x" event={"ID":"70cca5ae-9de4-4933-a6f8-4a23ab711bbf","Type":"ContainerStarted","Data":"cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.637795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.637840 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.637850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.637864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.637874 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.646414 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.656547 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.660329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:03 crc kubenswrapper[4775]: E0321 04:49:03.660450 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.677416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.687660 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.697367 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.705277 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.714212 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.722351 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.733814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.740623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.740665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.740674 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.740692 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.740702 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.743443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.752552 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.760602 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.767798 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.774526 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.784747 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.797094 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381
f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.808017 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.818654 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.828443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.844663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.844699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.844709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.844722 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.844731 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.850502 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.866938 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.888883 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.904162 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.915881 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.926184 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.938498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.947171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.947199 4775 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.947207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.947220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:03 crc kubenswrapper[4775]: I0321 04:49:03.947228 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:03Z","lastTransitionTime":"2026-03-21T04:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.049852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.049891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.049900 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.049915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.049928 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.153719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.153768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.153778 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.153794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.153806 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.256143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.256191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.256202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.256219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.256232 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.359620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.359677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.359686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.359700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.359710 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.462248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.462279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.462289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.462302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.462310 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.564652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.564682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.564691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.564703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.564714 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.632368 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18" exitCode=0 Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.632987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.638085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.638153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.638169 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.649951 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.660916 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.660926 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:04 crc kubenswrapper[4775]: E0321 04:49:04.661011 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:04 crc kubenswrapper[4775]: E0321 04:49:04.661072 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667269 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.667848 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.675450 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.683054 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kr988"] Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.683435 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.685585 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.685837 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.686080 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.686180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.686239 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.693662 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.705061 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.714169 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.723659 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.732888 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.741637 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.756827 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.766682 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.770006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.770031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.770039 4775 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.770053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.770061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.778192 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.786154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc87971f-e8fc-454d-8513-957a0bbad389-serviceca\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 
04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.786220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kj8\" (UniqueName: \"kubernetes.io/projected/cc87971f-e8fc-454d-8513-957a0bbad389-kube-api-access-q2kj8\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.786246 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc87971f-e8fc-454d-8513-957a0bbad389-host\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.793788 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://33
6231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.801332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.818179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.851760 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.866749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.871729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.871805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.871819 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.871835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.871845 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.878798 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.887702 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc87971f-e8fc-454d-8513-957a0bbad389-serviceca\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.887754 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kj8\" (UniqueName: \"kubernetes.io/projected/cc87971f-e8fc-454d-8513-957a0bbad389-kube-api-access-q2kj8\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.887774 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc87971f-e8fc-454d-8513-957a0bbad389-host\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.887823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc87971f-e8fc-454d-8513-957a0bbad389-host\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.888684 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc87971f-e8fc-454d-8513-957a0bbad389-serviceca\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.895562 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.904625 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.911757 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.918613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kj8\" (UniqueName: \"kubernetes.io/projected/cc87971f-e8fc-454d-8513-957a0bbad389-kube-api-access-q2kj8\") pod \"node-ca-kr988\" (UID: \"cc87971f-e8fc-454d-8513-957a0bbad389\") " pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.922357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.936713 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.948239 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.958775 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.967832 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.973842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.973880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.973892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.973909 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:49:04 crc kubenswrapper[4775]: I0321 04:49:04.973921 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:04Z","lastTransitionTime":"2026-03-21T04:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.059148 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kr988" Mar 21 04:49:05 crc kubenswrapper[4775]: W0321 04:49:05.070487 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc87971f_e8fc_454d_8513_957a0bbad389.slice/crio-45d54ab10f38d7c9c90f7ae9127338cd3e665ac080edc58a170dfeae4b30c23b WatchSource:0}: Error finding container 45d54ab10f38d7c9c90f7ae9127338cd3e665ac080edc58a170dfeae4b30c23b: Status 404 returned error can't find the container with id 45d54ab10f38d7c9c90f7ae9127338cd3e665ac080edc58a170dfeae4b30c23b Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.083472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.083517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.083529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.083552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 
04:49:05.083564 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.185506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.185556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.185568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.185587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.185599 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.288825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.288868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.288880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.288898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.288914 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.391522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.391547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.391555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.391568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.391577 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.494068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.494143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.494156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.494174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.494186 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.596277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.596311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.596320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.596333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.596342 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.648249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.648298 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.654068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.654131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.654145 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.655245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kr988" event={"ID":"cc87971f-e8fc-454d-8513-957a0bbad389","Type":"ContainerStarted","Data":"88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8"} Mar 21 04:49:05 
crc kubenswrapper[4775]: I0321 04:49:05.655274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kr988" event={"ID":"cc87971f-e8fc-454d-8513-957a0bbad389","Type":"ContainerStarted","Data":"45d54ab10f38d7c9c90f7ae9127338cd3e665ac080edc58a170dfeae4b30c23b"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.657011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.661445 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.661500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: E0321 04:49:05.661584 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.672620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.681715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.692942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.697975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.698008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.698017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.698031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.698041 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.710925 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.721362 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.736482 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.748000 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.759331 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.770099 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.781175 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.789088 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.798379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.799827 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.799854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.799863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.799877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.799899 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.810861 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.819087 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.832081 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.843296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381
f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.853303 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.862581 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.874744 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.896768 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.902226 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.902278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.902290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.902311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.902329 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:05Z","lastTransitionTime":"2026-03-21T04:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.910312 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.929218 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.941156 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.950921 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.958845 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.967852 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:05 crc kubenswrapper[4775]: I0321 04:49:05.975355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.005224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.005282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.005295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.005317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.005331 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.112323 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.112876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.112889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.112910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.112922 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.215584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.215628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.215640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.215657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.215668 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.318361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.318418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.318428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.318447 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.318458 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.421130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.421182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.421194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.421212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.421406 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.524284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.524323 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.524340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.524360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.524374 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.627396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.627873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.627887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.627906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.627918 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.660433 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:06 crc kubenswrapper[4775]: E0321 04:49:06.660570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.660501 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:06 crc kubenswrapper[4775]: E0321 04:49:06.660904 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.665789 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7" exitCode=0 Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.665830 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.685439 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.698525 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.718969 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.730687 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.730965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.730996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.731004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 
04:49:06.731018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.731027 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.743982 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.756561 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.769754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.782751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.793352 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.808399 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.824900 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.834056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.834080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.834089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.834101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc 
kubenswrapper[4775]: I0321 04:49:06.834109 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.837929 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.848282 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.863307 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.937148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.937192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.937262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:06 crc 
kubenswrapper[4775]: I0321 04:49:06.937280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:06 crc kubenswrapper[4775]: I0321 04:49:06.937292 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:06Z","lastTransitionTime":"2026-03-21T04:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.039565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.039606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.039618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.039638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.039650 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.142210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.142251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.142263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.142279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.142291 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.244596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.244645 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.244658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.244675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.244689 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.347229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.347277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.347287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.347305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.347317 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.449795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.449852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.449864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.449886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.449896 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.552857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.552889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.552897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.552911 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.552921 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.655683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.655728 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.655737 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.655750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.655759 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.661337 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:07 crc kubenswrapper[4775]: E0321 04:49:07.661459 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.671747 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785" exitCode=0 Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.671808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.674320 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.679075 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.693318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.702965 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.714448 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.723746 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.733491 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.748715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.760331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.760364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.760372 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.760385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.760393 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.762023 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.773545 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.785304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.795626 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.813827 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.826065 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.853780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.862341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.862492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.862568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.862633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.862700 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.871518 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.887079 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.903787 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.915835 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.928210 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.940963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.953388 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.963770 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.965916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.966019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.966076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.966152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.966230 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:07Z","lastTransitionTime":"2026-03-21T04:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.974541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:07 crc kubenswrapper[4775]: I0321 04:49:07.988302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c4
78ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.001814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.012616 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.024328 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.034777 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.069138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.069409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.069506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc 
kubenswrapper[4775]: I0321 04:49:08.069605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.069682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.167680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.167731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.167743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.167762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.167773 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.181216 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.184570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.184619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.184636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.184656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.184669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.209953 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.213763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.213803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.213815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.213831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.213844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.229674 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.233039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.233066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.233075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.233089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.233098 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.245952 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.249004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.249035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.249052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.249067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.249078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.260135 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.260296 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.262092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.262149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.262161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.262180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.262191 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.323817 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.323926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.323951 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.324040 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:49:24.324009027 +0000 UTC m=+117.300472661 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.324052 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.324145 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.324206 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:24.324194433 +0000 UTC m=+117.300658107 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.324257 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:49:24.324244614 +0000 UTC m=+117.300708238 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.363915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.363961 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.363974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.363993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.364006 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.424975 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.425021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425170 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425187 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425200 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425250 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:24.425236455 +0000 UTC m=+117.401700079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425277 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425329 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425343 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.425408 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:24.425389439 +0000 UTC m=+117.401853123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.466785 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.466822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.466833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.466849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.466860 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.569220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.569518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.569536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.569554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.569566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.660604 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.660632 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.660744 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:08 crc kubenswrapper[4775]: E0321 04:49:08.661142 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.671542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.671576 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.671587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.671611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.671624 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.681428 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5" exitCode=0 Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.681496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.685140 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.692007 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.702081 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.715479 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.726575 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.738003 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.755460 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.772640 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.773603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.773626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.773634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.773647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.773656 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.784296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 
04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.795151 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.805211 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.814072 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.824639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.838073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b
7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.848448 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:08Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.875847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.875879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.875888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.875901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.875911 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.978617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.978661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.978674 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.978692 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:08 crc kubenswrapper[4775]: I0321 04:49:08.978705 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:08Z","lastTransitionTime":"2026-03-21T04:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.081213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.081255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.081263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.081294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.081306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.184546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.184602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.184615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.184634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.184647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.286523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.286558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.286567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.286580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.286591 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.389491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.389543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.389557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.389572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.389586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.491966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.492040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.492060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.492091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.492111 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.594802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.594869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.594880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.594894 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.594903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.661046 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:09 crc kubenswrapper[4775]: E0321 04:49:09.661196 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.693228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.694603 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.694750 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.694806 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.698583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.698623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.698639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.698659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.698674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.710499 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.726409 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.736158 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.738945 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.743004 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.755580 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.776845 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.790680 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.801434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.801470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.801491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc 
kubenswrapper[4775]: I0321 04:49:09.801506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.801517 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.811461 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.826380 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.842275 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.856419 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.870962 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.882345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.891357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.903895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.903938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.903949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.903966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.903978 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:09Z","lastTransitionTime":"2026-03-21T04:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.906053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.917922 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.930246 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.947541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.962253 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:09 crc kubenswrapper[4775]: I0321 04:49:09.979313 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.000188 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.006456 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.006784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.006796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.006815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.006825 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.019653 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.033160 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.044213 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.055846 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.066807 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.079810 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.096831 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b
7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.108604 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.109147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.109205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.109242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.109263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.109275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.211033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.211084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.211095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.211110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.211137 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.313168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.313210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.313220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.313236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.313246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.415313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.415355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.415367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.415387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.415399 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.517627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.517663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.517672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.517686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.517696 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.621274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.621348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.621370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.621393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.621411 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.661305 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.661396 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:10 crc kubenswrapper[4775]: E0321 04:49:10.661539 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:10 crc kubenswrapper[4775]: E0321 04:49:10.661733 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.697305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.707697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.710289 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.723150 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.724205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.724277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.724288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.724301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.724312 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.735947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.748265 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.771048 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-
21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.782388 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.798202 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.808331 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.817169 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.826802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.826841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.826849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.826868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.826877 
4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.833828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.846281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.858051 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.868673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.881422 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.898935 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.913938 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.929761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.929804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.929818 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.929833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.929845 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:10Z","lastTransitionTime":"2026-03-21T04:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.934072 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.959828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.976809 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:10 crc kubenswrapper[4775]: I0321 04:49:10.999451 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.011728 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.023111 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.032529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.032590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.032641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.032657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.032665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.038817 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.054479 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c4
78ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.070021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.082751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.094481 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.107873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.134383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.134421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.134429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.134445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc 
kubenswrapper[4775]: I0321 04:49:11.134466 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.236889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.236922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.236931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.236944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.236954 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.339525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.339575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.339586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.339601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.339946 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.443139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.443192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.443207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.443227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.443241 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.544880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.544918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.544930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.544945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.544955 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.647643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.647677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.647686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.647699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.647709 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.661411 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:11 crc kubenswrapper[4775]: E0321 04:49:11.661566 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.749546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.749572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.749580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.749593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.749602 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.852405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.852764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.852774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.852789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.852799 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.955149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.955210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.955230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.955255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:11 crc kubenswrapper[4775]: I0321 04:49:11.955269 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:11Z","lastTransitionTime":"2026-03-21T04:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.057889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.057926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.057935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.057947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.057955 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.159999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.160041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.160056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.160072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.160084 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.262403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.262444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.262459 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.262476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.262488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.365385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.365449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.365461 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.365481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.365496 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.467694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.467732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.467740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.467756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.467765 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.570012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.570061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.570071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.570087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.570098 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.661316 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.661407 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:12 crc kubenswrapper[4775]: E0321 04:49:12.661513 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:12 crc kubenswrapper[4775]: E0321 04:49:12.661653 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.673038 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.673082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.673094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.673106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:12 crc kubenswrapper[4775]: I0321 04:49:12.673128 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:12Z","lastTransitionTime":"2026-03-21T04:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.744978 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:13 crc kubenswrapper[4775]: E0321 04:49:13.745136 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.748846 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd" exitCode=0 Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.748940 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.749039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:13 crc kubenswrapper[4775]: E0321 04:49:13.749053 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.749065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.749075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.749096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.749106 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:13Z","lastTransitionTime":"2026-03-21T04:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.761721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd"} Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.788844 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:3
2Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.804425 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.830757 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.846385 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-2
1T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.851138 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.851174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.851185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.851202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.851214 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:13Z","lastTransitionTime":"2026-03-21T04:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.859914 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555
046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.870347 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.884078 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.897322 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.912861 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.929145 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.953614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.953641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.953649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.953661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.953674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:13Z","lastTransitionTime":"2026-03-21T04:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.954781 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.973764 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.986213 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:13 crc kubenswrapper[4775]: I0321 04:49:13.997694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.057428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 
04:49:14.057457 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.057465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.057478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.057487 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.149957 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz"] Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.150422 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.152287 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.153063 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.160756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.160791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.160801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.160816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.160827 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.175256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.188339 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.198858 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.212154 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.221620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.232825 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.247375 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.249589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.249754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.249844 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4d4t\" (UniqueName: \"kubernetes.io/projected/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-kube-api-access-l4d4t\") pod 
\"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.249955 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.258197 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.262590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.262634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.262646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.262662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.262673 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.271968 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.283617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.294726 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.305256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.322232 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.347707 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.351644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.351708 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.351729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4d4t\" (UniqueName: \"kubernetes.io/projected/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-kube-api-access-l4d4t\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.351750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.352934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.353251 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.359899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.364810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.364843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.364853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.364868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.364878 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.371675 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4d4t\" (UniqueName: \"kubernetes.io/projected/38cb395c-744c-4c2e-9e32-b6cb206a9c5b-kube-api-access-l4d4t\") pod \"ovnkube-control-plane-749d76644c-62jtz\" (UID: \"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.379392 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.467421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.467624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.467766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.467909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.468049 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.475610 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" Mar 21 04:49:14 crc kubenswrapper[4775]: W0321 04:49:14.510326 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38cb395c_744c_4c2e_9e32_b6cb206a9c5b.slice/crio-76f13cfa83a1aa67a2dd389b280570ab2bbbdd5b23c63625d56e8d41b50a46d4 WatchSource:0}: Error finding container 76f13cfa83a1aa67a2dd389b280570ab2bbbdd5b23c63625d56e8d41b50a46d4: Status 404 returned error can't find the container with id 76f13cfa83a1aa67a2dd389b280570ab2bbbdd5b23c63625d56e8d41b50a46d4 Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.575331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.575367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.575377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.575392 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.575401 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.661393 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:14 crc kubenswrapper[4775]: E0321 04:49:14.661734 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.661799 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.677436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.677481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.677491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.677508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.677519 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.754207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.755973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" event={"ID":"38cb395c-744c-4c2e-9e32-b6cb206a9c5b","Type":"ContainerStarted","Data":"76f13cfa83a1aa67a2dd389b280570ab2bbbdd5b23c63625d56e8d41b50a46d4"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.766934 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.780545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.780570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.780593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 
04:49:14.780607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.780616 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.781157 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.792697 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.806150 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.823681 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.838407 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.863589 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.875039 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-2
1T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.882576 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.882608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.882618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.882635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.882647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.885111 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555
046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.898237 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.909246 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.920832 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.929108 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.942699 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.954625 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.984944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.984979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.984988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.985003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:14 crc kubenswrapper[4775]: I0321 04:49:14.985011 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:14Z","lastTransitionTime":"2026-03-21T04:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.087574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.087613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.087622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.087637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.087647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.190102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.190153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.190165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.190179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.190190 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.293005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.293039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.293050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.293067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.293078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.395976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.396030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.396043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.396060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.396072 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.499034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.499076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.499089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.499108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.499135 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.601531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.601567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.601577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.601592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.601601 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.661069 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.661248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:15 crc kubenswrapper[4775]: E0321 04:49:15.661600 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:15 crc kubenswrapper[4775]: E0321 04:49:15.661745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.703915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.703963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.703980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.704023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.704040 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.763341 4775 generic.go:334] "Generic (PLEG): container finished" podID="957cde70-ca20-438a-a4bf-42481dddb2db" containerID="b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1" exitCode=0 Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.763384 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerDied","Data":"b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.790032 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.806111 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.809485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.809535 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.809546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.809564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.809574 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.831346 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.848477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-2
1T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.864356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.878580 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.894135 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.904488 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.911739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.911776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.911787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.911802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.911813 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:15Z","lastTransitionTime":"2026-03-21T04:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.915179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.929698 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.941183 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.954617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.968156 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.978519 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.990232 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:15Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.992036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xk9f5"] Mar 21 04:49:15 crc kubenswrapper[4775]: I0321 04:49:15.992567 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:15 crc kubenswrapper[4775]: E0321 04:49:15.992619 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.001580 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.014424 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.014939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.015026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.015040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.015063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.015074 4775 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.025021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.036463 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.047406 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.057810 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.067323 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.082029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68dp\" (UniqueName: 
\"kubernetes.io/projected/6920413a-2c51-466d-a16e-d14489ae0c6c-kube-api-access-h68dp\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.082067 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.083522 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"n
ame\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.093764 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.111633 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.116727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.116756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.116763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.116776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.116785 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.122853 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.131515 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.143105 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.154742 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.167934 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.179696 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.183004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h68dp\" (UniqueName: 
\"kubernetes.io/projected/6920413a-2c51-466d-a16e-d14489ae0c6c-kube-api-access-h68dp\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.183045 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: E0321 04:49:16.183278 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:16 crc kubenswrapper[4775]: E0321 04:49:16.183363 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:16.683343475 +0000 UTC m=+109.659807099 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.202137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68dp\" (UniqueName: \"kubernetes.io/projected/6920413a-2c51-466d-a16e-d14489ae0c6c-kube-api-access-h68dp\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.219872 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.219904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.219912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.219925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.219933 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.323058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.323094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.323104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.323133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.323144 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.425629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.425661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.425672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.425687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.425699 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.528254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.528289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.528299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.528315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.528326 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.630816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.630852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.630863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.630880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.630889 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.660464 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:16 crc kubenswrapper[4775]: E0321 04:49:16.660577 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.687700 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:16 crc kubenswrapper[4775]: E0321 04:49:16.687831 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:16 crc kubenswrapper[4775]: E0321 04:49:16.687899 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:17.687880058 +0000 UTC m=+110.664343682 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.733053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.733100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.733110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.733157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.733169 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.768156 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.769379 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.770577 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.772626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" event={"ID":"38cb395c-744c-4c2e-9e32-b6cb206a9c5b","Type":"ContainerStarted","Data":"226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.772680 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" event={"ID":"38cb395c-744c-4c2e-9e32-b6cb206a9c5b","Type":"ContainerStarted","Data":"99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.783836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" event={"ID":"957cde70-ca20-438a-a4bf-42481dddb2db","Type":"ContainerStarted","Data":"12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.786035 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/0.log" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.788493 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4" exitCode=1 Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.788534 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.789249 4775 scope.go:117] "RemoveContainer" containerID="55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.804292 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.816416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.834792 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.835100 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.835133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.835145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.835158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.835166 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.848683 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.859926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.874421 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.886281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.897202 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.911145 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.921279 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.936470 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.937667 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.937698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.937708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.937724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.937732 4775 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:16Z","lastTransitionTime":"2026-03-21T04:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.948051 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.961531 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.976935 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:16 crc kubenswrapper[4775]: I0321 04:49:16.990922 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.004762 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.028477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f1
87c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.040229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.040268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.040279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.040295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.040305 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.045995 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.058042 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.073275 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.084349 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9
630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.101828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.115636 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.127182 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.142018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.142108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.142140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.142161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.142175 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.144338 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI0321 04:49:15.347720 6490 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0321 04:49:15.348031 6490 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:49:15.348133 6490 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0321 04:49:15.348148 6490 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:49:15.348174 6490 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0321 04:49:15.348468 6490 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0321 04:49:15.348502 6490 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:49:15.348771 6490 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:49:15.348774 6490 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:49:15.348797 6490 factory.go:656] Stopping watch factory\\\\nI0321 04:49:15.348808 6490 ovnkube.go:599] Stopped ovnkube\\\\nI0321 
04:49:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dd
dea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.165887 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.183218 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.198299 4775 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.211298 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.222086 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.235103 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc 
kubenswrapper[4775]: I0321 04:49:17.244690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.244719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.244730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.244745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.244756 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.247900 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.347643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.347714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.347726 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.347743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.347756 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.450208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.450245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.450254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.450268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.450276 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.553235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.553273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.553282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.553296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.553305 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.659333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.659376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.659388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.659403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.659413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.660671 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.660717 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.660747 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:17 crc kubenswrapper[4775]: E0321 04:49:17.660849 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:17 crc kubenswrapper[4775]: E0321 04:49:17.660933 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:17 crc kubenswrapper[4775]: E0321 04:49:17.661042 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.676618 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.690187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.701357 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:17 crc kubenswrapper[4775]: E0321 04:49:17.701599 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:17 crc kubenswrapper[4775]: E0321 04:49:17.701710 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:19.701682356 +0000 UTC m=+112.678146020 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.703273 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.713349 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.734163 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.750232 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.761608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.761649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.761664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc 
kubenswrapper[4775]: I0321 04:49:17.761681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.761723 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.770707 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI0321 04:49:15.347720 6490 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0321 04:49:15.348031 6490 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:49:15.348133 6490 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0321 04:49:15.348148 6490 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:49:15.348174 6490 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0321 04:49:15.348468 6490 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0321 04:49:15.348502 6490 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:49:15.348771 6490 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:49:15.348774 6490 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:49:15.348797 6490 factory.go:656] Stopping watch factory\\\\nI0321 04:49:15.348808 6490 ovnkube.go:599] Stopped ovnkube\\\\nI0321 
04:49:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dd
dea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.783924 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.793033 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/0.log" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.796195 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.796577 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"appro
ver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc 
kubenswrapper[4775]: I0321 04:49:17.808898 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.823954 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.833380 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.843237 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc 
kubenswrapper[4775]: I0321 04:49:17.853775 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.864094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.864142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.864162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.864179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.864189 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.868195 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.882786 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.895412 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc 
kubenswrapper[4775]: I0321 04:49:17.910814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.926847 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.940635 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.954624 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.965850 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.967322 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.967357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.967366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.967379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.967389 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:17Z","lastTransitionTime":"2026-03-21T04:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.976756 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:17 crc kubenswrapper[4775]: I0321 04:49:17.990510 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.001708 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.015508 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.028893 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.040935 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.052860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.071288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 
04:49:18.071339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.071351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.071366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.071381 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.074632 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.089269 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.110055 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI0321 04:49:15.347720 6490 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0321 04:49:15.348031 6490 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:49:15.348133 6490 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0321 04:49:15.348148 6490 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:49:15.348174 6490 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0321 04:49:15.348468 6490 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0321 04:49:15.348502 6490 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:49:15.348771 6490 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:49:15.348774 6490 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:49:15.348797 6490 factory.go:656] Stopping watch factory\\\\nI0321 04:49:15.348808 6490 ovnkube.go:599] Stopped ovnkube\\\\nI0321 
04:49:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.173542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.173581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.173594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.173611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.173623 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.276025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.276055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.276064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.276080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.276089 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.307468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.307512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.307527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.307545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.307558 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.319319 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.322742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.322780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.322792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.322812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.322824 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.335913 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.339584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.339620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.339629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.339655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.339665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.352372 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.356001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.356050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.356063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.356087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.356104 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.369842 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.373663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.373704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.373717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.373755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.373768 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.387244 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.387412 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.388884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.388930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.388942 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.388959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.388972 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.491427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.491491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.491506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.491532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.491543 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.594796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.594854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.594867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.594888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.594903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.660544 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.660679 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.698298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.698347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.698357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.698375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.698387 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.800814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.800864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.800877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.800897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.800913 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.802020 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/1.log" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.802551 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/0.log" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.804769 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e" exitCode=1 Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.804805 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.804841 4775 scope.go:117] "RemoveContainer" containerID="55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.805465 4775 scope.go:117] "RemoveContainer" containerID="0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e" Mar 21 04:49:18 crc kubenswrapper[4775]: E0321 04:49:18.805697 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.819031 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b
42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.829137 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-r
bac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.842668 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.853272 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.865176 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.880692 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.894714 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.903255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.903304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:18 crc 
kubenswrapper[4775]: I0321 04:49:18.903314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.903330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.903343 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:18Z","lastTransitionTime":"2026-03-21T04:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.913435 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.932467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI0321 04:49:15.347720 6490 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0321 04:49:15.348031 6490 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:49:15.348133 6490 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0321 04:49:15.348148 6490 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:49:15.348174 6490 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0321 04:49:15.348468 6490 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0321 04:49:15.348502 6490 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:49:15.348771 6490 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:49:15.348774 6490 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:49:15.348797 6490 factory.go:656] Stopping watch factory\\\\nI0321 04:49:15.348808 6490 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port 
openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\
\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.951034 4775 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43
e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 crc kubenswrapper[4775]: I0321 04:49:18.967315 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:18 
crc kubenswrapper[4775]: I0321 04:49:18.983477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.003222 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.006080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc 
kubenswrapper[4775]: I0321 04:49:19.006169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.006189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.006213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.006228 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.018093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.030488 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:19 crc 
kubenswrapper[4775]: I0321 04:49:19.049470 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.109606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.109663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.109676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.109697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.109710 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.212195 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.212238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.212246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.212261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.212272 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.314502 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.314543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.314554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.314568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.314579 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.417043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.417091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.417147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.417172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.417182 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.519372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.519409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.519429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.519447 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.519457 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.621559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.621654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.621676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.621701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.621715 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.662237 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:19 crc kubenswrapper[4775]: E0321 04:49:19.663032 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.662381 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:19 crc kubenswrapper[4775]: E0321 04:49:19.663139 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.662335 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:19 crc kubenswrapper[4775]: E0321 04:49:19.663198 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.723366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:19 crc kubenswrapper[4775]: E0321 04:49:19.723480 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:19 crc kubenswrapper[4775]: E0321 04:49:19.723543 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:23.723526033 +0000 UTC m=+116.699989657 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.724927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.724965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.724976 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.724993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.725019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.809346 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/1.log" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.827824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.827879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.827893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.827910 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.827919 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.930012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.930048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.930064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.930082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:19 crc kubenswrapper[4775]: I0321 04:49:19.930092 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:19Z","lastTransitionTime":"2026-03-21T04:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.033065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.033164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.033179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.033205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.033222 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.135542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.135603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.135611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.135628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.135638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.243318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.243368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.243380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.243398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.243409 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.346360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.346714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.346727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.346744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.346756 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.449271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.449313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.449324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.449343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.449356 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.551618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.551659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.551669 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.551684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.551697 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.659441 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.659490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.659501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.659519 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.659552 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.660529 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:20 crc kubenswrapper[4775]: E0321 04:49:20.660781 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.762774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.762816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.762828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.762847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.762859 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.864888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.864935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.864946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.864962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.864973 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.968351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.968419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.968439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.968467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:20 crc kubenswrapper[4775]: I0321 04:49:20.968488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:20Z","lastTransitionTime":"2026-03-21T04:49:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.071665 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.071712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.071724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.071741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.071754 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.175596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.175628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.175637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.175650 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.175658 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.278863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.278915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.278927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.278947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.278960 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.381307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.381339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.381347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.381360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.381369 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.484329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.484399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.484417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.484446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.484469 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.586836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.586881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.586890 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.586907 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.586923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.661149 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.661184 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.661302 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:21 crc kubenswrapper[4775]: E0321 04:49:21.661467 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:21 crc kubenswrapper[4775]: E0321 04:49:21.661648 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:21 crc kubenswrapper[4775]: E0321 04:49:21.661826 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.689521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.689574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.689586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.689604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.689621 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.792385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.792414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.792425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.792440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.792451 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.894509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.894546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.894557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.894573 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.894583 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.997191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.997524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.997653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.997784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:21 crc kubenswrapper[4775]: I0321 04:49:21.997915 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:21Z","lastTransitionTime":"2026-03-21T04:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.099919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.099984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.099998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.100019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.100030 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.202239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.202294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.202304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.202320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.202333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.304558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.304636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.304651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.304676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.304692 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.408010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.408079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.408102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.408148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.408161 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.510651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.510909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.511049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.511263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.511430 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.614163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.614512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.614661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.614753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.614941 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.661059 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:22 crc kubenswrapper[4775]: E0321 04:49:22.661441 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.716956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.717001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.717017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.717037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.717052 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.820431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.820496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.820508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.820530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.820550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.923432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.923482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.923491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.923506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:22 crc kubenswrapper[4775]: I0321 04:49:22.923515 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:22Z","lastTransitionTime":"2026-03-21T04:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.027151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.027530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.027786 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.027967 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.028172 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.132217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.132614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.132688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.132774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.132916 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.236395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.236933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.237021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.237180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.237299 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.340342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.340583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.340700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.340801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.340903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.443794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.443837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.443851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.443866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.443876 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.549756 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.549796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.549806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.549821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.549831 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.652640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.652683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.652694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.652709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.652720 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.660981 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.661029 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.661090 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:23 crc kubenswrapper[4775]: E0321 04:49:23.661087 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:23 crc kubenswrapper[4775]: E0321 04:49:23.661184 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:23 crc kubenswrapper[4775]: E0321 04:49:23.661279 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.755089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.755163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.755174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.755193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.755206 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.767645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:23 crc kubenswrapper[4775]: E0321 04:49:23.767764 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:23 crc kubenswrapper[4775]: E0321 04:49:23.767817 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:31.767803403 +0000 UTC m=+124.744267027 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.857430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.857463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.857472 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.857484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.857493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.959803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.960255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.960287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.960308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:23 crc kubenswrapper[4775]: I0321 04:49:23.960318 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:23Z","lastTransitionTime":"2026-03-21T04:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.063468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.063515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.063526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.063543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.063555 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.166193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.166239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.166253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.166271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.166282 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.268507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.268546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.268558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.268574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.268587 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.371851 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.371969 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.371994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.372045 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:49:56.372020119 +0000 UTC m=+149.348483763 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.372058 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.372106 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.372128 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:56.372103431 +0000 UTC m=+149.348567055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.372164 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:49:56.372151203 +0000 UTC m=+149.348614827 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.372293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.372306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.372314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.372327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.372335 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.472864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.472907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473002 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473025 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473036 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473054 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:24 crc 
kubenswrapper[4775]: E0321 04:49:24.473072 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473084 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473085 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:56.473069762 +0000 UTC m=+149.449533386 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.473183 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:49:56.473169704 +0000 UTC m=+149.449633328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.474184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.474297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.474391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.474473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.474551 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.576841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.577207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.577294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.577383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.577523 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.660774 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:24 crc kubenswrapper[4775]: E0321 04:49:24.660903 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.679617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.679887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.679968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.680058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.680156 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.783396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.783438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.783449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.783467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.783479 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.886094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.886172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.886192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.886214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.886231 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.988673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.989326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.989396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.989431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:24 crc kubenswrapper[4775]: I0321 04:49:24.989454 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:24Z","lastTransitionTime":"2026-03-21T04:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.091856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.091909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.091928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.091951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.091968 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.194524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.194579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.194607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.194632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.194650 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.297104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.297167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.297185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.297216 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.297232 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.399181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.399572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.399728 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.399830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.399935 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.502153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.502193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.502212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.502229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.502241 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.604513 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.604770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.604840 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.604914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.604987 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.661364 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:25 crc kubenswrapper[4775]: E0321 04:49:25.661517 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.661597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.661617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:25 crc kubenswrapper[4775]: E0321 04:49:25.661753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:25 crc kubenswrapper[4775]: E0321 04:49:25.661827 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.707420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.707471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.707484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.707503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.707515 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.810475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.810517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.810528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.810545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.810556 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.912944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.912990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.913001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.913017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:25 crc kubenswrapper[4775]: I0321 04:49:25.913028 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:25Z","lastTransitionTime":"2026-03-21T04:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.015436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.015511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.015522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.015571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.015588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.117730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.117775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.117786 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.117805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.117817 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.220298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.220348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.220364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.220386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.220402 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.323486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.323543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.323565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.323592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.323612 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.426594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.426666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.426687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.426712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.426729 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.529296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.529534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.529626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.529718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.529809 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.632801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.633108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.633240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.633343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.633445 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.661139 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:26 crc kubenswrapper[4775]: E0321 04:49:26.661267 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.735158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.735199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.735212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.735227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.735241 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.836556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.836606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.836618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.836632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.836642 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.938518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.938592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.938604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.938619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:26 crc kubenswrapper[4775]: I0321 04:49:26.938630 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:26Z","lastTransitionTime":"2026-03-21T04:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.040730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.040764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.040773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.040787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.040796 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.142952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.142995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.143008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.143024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.143035 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.246081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.246152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.246167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.246186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.246197 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.349295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.349329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.349340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.349355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.349365 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.451349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.451623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.451699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.451763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.451822 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.554218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.554463 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.554554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.554651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.554742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:27Z","lastTransitionTime":"2026-03-21T04:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:27 crc kubenswrapper[4775]: E0321 04:49:27.655193 4775 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.660337 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.660420 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:27 crc kubenswrapper[4775]: E0321 04:49:27.660605 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:27 crc kubenswrapper[4775]: E0321 04:49:27.660641 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.660348 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:27 crc kubenswrapper[4775]: E0321 04:49:27.661697 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.678830 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.698793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.710786 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.726015 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.746144 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.762598 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.785097 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55cbe34c19fee4a7423b106752c0f2fc2f2beebb0b87e92ad58b6ef9fc166db4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"message\\\":\\\"factory.go:160\\\\nI0321 04:49:15.347720 6490 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0321 04:49:15.348031 6490 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:49:15.348133 6490 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0321 04:49:15.348148 6490 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:49:15.348174 6490 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0321 04:49:15.348468 6490 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0321 04:49:15.348502 6490 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:49:15.348771 6490 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:49:15.348774 6490 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:49:15.348797 6490 factory.go:656] Stopping watch factory\\\\nI0321 04:49:15.348808 6490 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port 
openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\
\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.798388 4775 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.812184 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.826197 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.840699 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.849954 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.860723 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc 
kubenswrapper[4775]: I0321 04:49:27.871561 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.887453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5
a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:27 crc kubenswrapper[4775]: I0321 04:49:27.898793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.661307 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.661435 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.745498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.745543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.745554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.745571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.745585 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:28Z","lastTransitionTime":"2026-03-21T04:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.751815 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.759506 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplac
e-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17d
e2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.re
dhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nod
eInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.763554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.763587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.763596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.763609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.763618 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:28Z","lastTransitionTime":"2026-03-21T04:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.777168 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.782240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.782299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.782318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.782342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.782363 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:28Z","lastTransitionTime":"2026-03-21T04:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.798278 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.802342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.802379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.802391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.802406 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.802416 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:28Z","lastTransitionTime":"2026-03-21T04:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.816592 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.820131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.820173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.820186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.820208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:28 crc kubenswrapper[4775]: I0321 04:49:28.820225 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:28Z","lastTransitionTime":"2026-03-21T04:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.831427 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:28 crc kubenswrapper[4775]: E0321 04:49:28.831565 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:29 crc kubenswrapper[4775]: I0321 04:49:29.660607 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:29 crc kubenswrapper[4775]: I0321 04:49:29.660707 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:29 crc kubenswrapper[4775]: E0321 04:49:29.660736 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:29 crc kubenswrapper[4775]: I0321 04:49:29.660788 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:29 crc kubenswrapper[4775]: E0321 04:49:29.660888 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:29 crc kubenswrapper[4775]: E0321 04:49:29.660994 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:30 crc kubenswrapper[4775]: I0321 04:49:30.661242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:30 crc kubenswrapper[4775]: E0321 04:49:30.661376 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:31 crc kubenswrapper[4775]: I0321 04:49:31.660784 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:31 crc kubenswrapper[4775]: E0321 04:49:31.660895 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:31 crc kubenswrapper[4775]: I0321 04:49:31.660796 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:31 crc kubenswrapper[4775]: E0321 04:49:31.660961 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:31 crc kubenswrapper[4775]: I0321 04:49:31.661425 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:31 crc kubenswrapper[4775]: E0321 04:49:31.661643 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:31 crc kubenswrapper[4775]: I0321 04:49:31.848220 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:31 crc kubenswrapper[4775]: E0321 04:49:31.848358 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:31 crc kubenswrapper[4775]: E0321 04:49:31.848511 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:49:47.848464609 +0000 UTC m=+140.824928233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.661287 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:32 crc kubenswrapper[4775]: E0321 04:49:32.661480 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.834449 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.836216 4775 scope.go:117] "RemoveContainer" containerID="0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.863864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.890365 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.906264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.921918 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.943553 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.953963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.964874 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc 
kubenswrapper[4775]: I0321 04:49:32.977789 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:32 crc kubenswrapper[4775]: I0321 04:49:32.995345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:32Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.007948 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.018990 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.033380 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.046504 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.060612 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 
04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.074103 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.085723 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.136915 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.155530 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\
",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb
51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9
be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.169324 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.191778 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.206665 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.217807 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.229242 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc 
kubenswrapper[4775]: I0321 04:49:33.245416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.259768 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.274075 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.285695 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.300807 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.312554 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.330467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.344134 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.356548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.368362 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.661139 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.661180 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.661223 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:33 crc kubenswrapper[4775]: E0321 04:49:33.661264 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:33 crc kubenswrapper[4775]: E0321 04:49:33.661374 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:33 crc kubenswrapper[4775]: E0321 04:49:33.661453 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:33 crc kubenswrapper[4775]: E0321 04:49:33.753532 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.860774 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/1.log" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.863769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b"} Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.864249 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.877764 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.894961 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.911700 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.926425 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.940877 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.954817 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.970189 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:33 crc kubenswrapper[4775]: I0321 04:49:33.995335 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:33Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.009585 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.043475 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.056946 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.071284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.084282 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.101605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.114239 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.127252 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc 
kubenswrapper[4775]: I0321 04:49:34.660777 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:34 crc kubenswrapper[4775]: E0321 04:49:34.660914 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.868864 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/2.log" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.869444 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/1.log" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.895377 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" exitCode=1 Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.895432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b"} Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.895484 4775 scope.go:117] "RemoveContainer" containerID="0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.895956 4775 scope.go:117] "RemoveContainer" 
containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" Mar 21 04:49:34 crc kubenswrapper[4775]: E0321 04:49:34.896145 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.910528 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117
cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7b
f4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
1T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.921928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.931696 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.941921 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.954053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9
630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.970263 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:34 crc kubenswrapper[4775]: I0321 04:49:34.985432 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.002396 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.021904 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0286b9718372015f0e06759103c765f3cf62e15f7e0bf343897571c50222731e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:17Z\\\",\\\"message\\\":\\\"networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880463 6825 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-khh7x\\\\nI0321 04:49:17.880734 6825 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0321 04:49:17.880494 6825 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0321 04:49:17.880753 6825 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nF0321 04:49:17.880790 6825 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir
\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\
\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: 
I0321 04:49:35.040977 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.055667 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.068156 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.080503 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.091389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.102498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc 
kubenswrapper[4775]: I0321 04:49:35.118016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.660680 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.660758 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:35 crc kubenswrapper[4775]: E0321 04:49:35.660911 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.660959 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:35 crc kubenswrapper[4775]: E0321 04:49:35.661144 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:35 crc kubenswrapper[4775]: E0321 04:49:35.661210 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.899454 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/2.log" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.902638 4775 scope.go:117] "RemoveContainer" containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" Mar 21 04:49:35 crc kubenswrapper[4775]: E0321 04:49:35.902783 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.915987 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.930332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.944362 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.956064 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.974431 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:35 crc kubenswrapper[4775]: I0321 04:49:35.994340 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.008639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.019077 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.030530 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.040161 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.049384 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc 
kubenswrapper[4775]: I0321 04:49:36.060068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.070579 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.081069 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.093255 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.109518 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:36 crc kubenswrapper[4775]: I0321 04:49:36.660545 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:36 crc kubenswrapper[4775]: E0321 04:49:36.660679 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.660415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.660576 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:37 crc kubenswrapper[4775]: E0321 04:49:37.660814 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.660840 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:37 crc kubenswrapper[4775]: E0321 04:49:37.660892 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:37 crc kubenswrapper[4775]: E0321 04:49:37.660965 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.679068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.692437 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.704850 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.716833 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.736806 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.749045 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.772697 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.783609 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.795521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.808021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.820318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.841883 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.853268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc 
kubenswrapper[4775]: I0321 04:49:37.864480 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.880256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5
a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:37 crc kubenswrapper[4775]: I0321 04:49:37.891953 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:37Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.660320 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.660518 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.754951 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.900667 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.900958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.900973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.900989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.900998 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:38Z","lastTransitionTime":"2026-03-21T04:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.916981 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:38Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.920501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.920544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.920558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.920575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.920586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:38Z","lastTransitionTime":"2026-03-21T04:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.933577 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:38Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.936743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.936772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.936780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.936792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.936801 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:38Z","lastTransitionTime":"2026-03-21T04:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.947752 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:38Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.951051 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.951150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.951165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.951180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.951192 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:38Z","lastTransitionTime":"2026-03-21T04:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.962778 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:38Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.965209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.965236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.965249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.965264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:38 crc kubenswrapper[4775]: I0321 04:49:38.965274 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:38Z","lastTransitionTime":"2026-03-21T04:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.975496 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:38Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:38 crc kubenswrapper[4775]: E0321 04:49:38.975609 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:39 crc kubenswrapper[4775]: I0321 04:49:39.660689 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:39 crc kubenswrapper[4775]: E0321 04:49:39.660950 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:39 crc kubenswrapper[4775]: I0321 04:49:39.661374 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:39 crc kubenswrapper[4775]: E0321 04:49:39.661473 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:39 crc kubenswrapper[4775]: I0321 04:49:39.661708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:39 crc kubenswrapper[4775]: E0321 04:49:39.661783 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:40 crc kubenswrapper[4775]: I0321 04:49:40.661213 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:40 crc kubenswrapper[4775]: E0321 04:49:40.661332 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:41 crc kubenswrapper[4775]: I0321 04:49:41.660821 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:41 crc kubenswrapper[4775]: I0321 04:49:41.660952 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:41 crc kubenswrapper[4775]: E0321 04:49:41.661006 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:41 crc kubenswrapper[4775]: I0321 04:49:41.661088 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:41 crc kubenswrapper[4775]: E0321 04:49:41.661273 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:41 crc kubenswrapper[4775]: E0321 04:49:41.661338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:42 crc kubenswrapper[4775]: I0321 04:49:42.660627 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:42 crc kubenswrapper[4775]: E0321 04:49:42.660808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:43 crc kubenswrapper[4775]: I0321 04:49:43.660389 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:43 crc kubenswrapper[4775]: I0321 04:49:43.660382 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:43 crc kubenswrapper[4775]: E0321 04:49:43.660841 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:43 crc kubenswrapper[4775]: I0321 04:49:43.660521 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:43 crc kubenswrapper[4775]: E0321 04:49:43.660947 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:43 crc kubenswrapper[4775]: E0321 04:49:43.661186 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:43 crc kubenswrapper[4775]: I0321 04:49:43.673036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 21 04:49:43 crc kubenswrapper[4775]: E0321 04:49:43.755746 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:49:44 crc kubenswrapper[4775]: I0321 04:49:44.660959 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:44 crc kubenswrapper[4775]: E0321 04:49:44.661084 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:45 crc kubenswrapper[4775]: I0321 04:49:45.660434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:45 crc kubenswrapper[4775]: I0321 04:49:45.660527 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:45 crc kubenswrapper[4775]: I0321 04:49:45.660457 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:45 crc kubenswrapper[4775]: E0321 04:49:45.660593 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:45 crc kubenswrapper[4775]: E0321 04:49:45.660665 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:45 crc kubenswrapper[4775]: E0321 04:49:45.660761 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:46 crc kubenswrapper[4775]: I0321 04:49:46.660426 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:46 crc kubenswrapper[4775]: E0321 04:49:46.660558 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.660468 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.660517 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.660555 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:47 crc kubenswrapper[4775]: E0321 04:49:47.660652 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:47 crc kubenswrapper[4775]: E0321 04:49:47.660720 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:47 crc kubenswrapper[4775]: E0321 04:49:47.660771 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.672092 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.688314 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.699697 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.709198 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc 
kubenswrapper[4775]: I0321 04:49:47.721387 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.734509 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.747391 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.758427 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.771434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.781627 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae9
3d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.795612 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.809187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.822553 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.839476 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.850380 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.870915 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.881396 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:47Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:47 crc kubenswrapper[4775]: I0321 04:49:47.918968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:47 crc kubenswrapper[4775]: E0321 04:49:47.919108 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:47 crc kubenswrapper[4775]: E0321 04:49:47.919204 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:50:19.919184945 +0000 UTC m=+172.895648569 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:49:48 crc kubenswrapper[4775]: I0321 04:49:48.661222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:48 crc kubenswrapper[4775]: E0321 04:49:48.661390 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:48 crc kubenswrapper[4775]: E0321 04:49:48.757348 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.068188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.068444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.068623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.068782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.068923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:49Z","lastTransitionTime":"2026-03-21T04:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.080636 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.083977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.084003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.084012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.084024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.084033 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:49Z","lastTransitionTime":"2026-03-21T04:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.094497 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.097982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.098024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.098059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.098079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.098093 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:49Z","lastTransitionTime":"2026-03-21T04:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.111383 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.115304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.115345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.115360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.115431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.115477 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:49Z","lastTransitionTime":"2026-03-21T04:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.130430 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.134339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.134382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.134394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.134411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.134426 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:49Z","lastTransitionTime":"2026-03-21T04:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.147530 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.147797 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.660972 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.660995 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.661398 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:49 crc kubenswrapper[4775]: I0321 04:49:49.661856 4775 scope.go:117] "RemoveContainer" containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.661536 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.661879 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.661650 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:49 crc kubenswrapper[4775]: E0321 04:49:49.662085 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:49:50 crc kubenswrapper[4775]: I0321 04:49:50.661448 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:50 crc kubenswrapper[4775]: E0321 04:49:50.661661 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:51 crc kubenswrapper[4775]: I0321 04:49:51.660811 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:51 crc kubenswrapper[4775]: E0321 04:49:51.661288 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:51 crc kubenswrapper[4775]: I0321 04:49:51.660897 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:51 crc kubenswrapper[4775]: E0321 04:49:51.661524 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:51 crc kubenswrapper[4775]: I0321 04:49:51.660813 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:51 crc kubenswrapper[4775]: E0321 04:49:51.662001 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.660803 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:52 crc kubenswrapper[4775]: E0321 04:49:52.661061 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.954957 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/0.log" Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.955000 4775 generic.go:334] "Generic (PLEG): container finished" podID="e77ec218-42da-4f07-b214-184c4f3b20f3" containerID="77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf" exitCode=1 Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.955027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerDied","Data":"77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf"} Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.955393 4775 scope.go:117] "RemoveContainer" containerID="77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf" Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.970831 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:52 crc kubenswrapper[4775]: I0321 04:49:52.991357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:52Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.007215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.021348 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.031403 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.040408 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc 
kubenswrapper[4775]: I0321 04:49:53.052042 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.070377 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5
a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.082992 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.100981 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.114411 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.126318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.137157 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.149036 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.168043 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.179969 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.198215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.661035 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:53 crc kubenswrapper[4775]: E0321 04:49:53.661450 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.661063 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:53 crc kubenswrapper[4775]: E0321 04:49:53.661727 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.661038 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:53 crc kubenswrapper[4775]: E0321 04:49:53.661936 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:53 crc kubenswrapper[4775]: E0321 04:49:53.759554 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.961046 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/0.log" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.961184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerStarted","Data":"eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae"} Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.976869 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355123
35ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:53 crc kubenswrapper[4775]: I0321 04:49:53.994260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f1
87c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:53Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.010260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.029216 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.044110 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.059291 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.071909 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.084904 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.106304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.120230 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.149275 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.165081 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.178260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.192203 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.207853 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.219268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.232456 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:54 crc 
kubenswrapper[4775]: I0321 04:49:54.660806 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:54 crc kubenswrapper[4775]: E0321 04:49:54.661248 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:54 crc kubenswrapper[4775]: I0321 04:49:54.671828 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 04:49:55 crc kubenswrapper[4775]: I0321 04:49:55.661165 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:55 crc kubenswrapper[4775]: I0321 04:49:55.661191 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:55 crc kubenswrapper[4775]: I0321 04:49:55.661217 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:55 crc kubenswrapper[4775]: E0321 04:49:55.661306 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:55 crc kubenswrapper[4775]: E0321 04:49:55.661396 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:55 crc kubenswrapper[4775]: E0321 04:49:55.661482 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.415402 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.415527 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:00.415507516 +0000 UTC m=+213.391971140 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.415650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.415760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.415795 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.415855 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:00.415842895 +0000 UTC m=+213.392306519 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.415889 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.415951 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:00.415940298 +0000 UTC m=+213.392403922 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.516314 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.516357 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516472 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516489 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516500 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516557 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:00.516542081 +0000 UTC m=+213.493005705 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516627 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516692 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516714 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.516801 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:00.516775128 +0000 UTC m=+213.493238752 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.660851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:56 crc kubenswrapper[4775]: E0321 04:49:56.660996 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:56 crc kubenswrapper[4775]: I0321 04:49:56.669492 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.660743 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.660801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.660957 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:57 crc kubenswrapper[4775]: E0321 04:49:57.660944 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:49:57 crc kubenswrapper[4775]: E0321 04:49:57.661155 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:57 crc kubenswrapper[4775]: E0321 04:49:57.661239 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.675393 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered 
and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.694892 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.709256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.720730 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.731967 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.756687 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.774328 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.795492 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.805249 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.816327 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.826398 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.835873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.846206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.853963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.862092 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc 
kubenswrapper[4775]: I0321 04:49:57.872963 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.885096 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.898951 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:57 crc kubenswrapper[4775]: I0321 04:49:57.910172 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:57Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:58 crc kubenswrapper[4775]: I0321 04:49:58.660591 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:49:58 crc kubenswrapper[4775]: E0321 04:49:58.661093 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:49:58 crc kubenswrapper[4775]: E0321 04:49:58.761704 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.501702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.501758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.501768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.501786 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.501799 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:59Z","lastTransitionTime":"2026-03-21T04:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.516481 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.519929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.519973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.519981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.519995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.520004 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:59Z","lastTransitionTime":"2026-03-21T04:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.534289 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.538611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.538649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.538659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.538675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.538686 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:59Z","lastTransitionTime":"2026-03-21T04:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.551676 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.555283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.555316 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.555326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.555340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.555358 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:59Z","lastTransitionTime":"2026-03-21T04:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.570165 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.573918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.573972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.573984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.574000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.574011 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:49:59Z","lastTransitionTime":"2026-03-21T04:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.586568 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:49:59Z is after 2025-08-24T17:21:41Z" Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.586693 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.661427 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.661557 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.661880 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.661947 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:49:59 crc kubenswrapper[4775]: I0321 04:49:59.662111 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:49:59 crc kubenswrapper[4775]: E0321 04:49:59.662200 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:00 crc kubenswrapper[4775]: I0321 04:50:00.660456 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:00 crc kubenswrapper[4775]: E0321 04:50:00.660561 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:01 crc kubenswrapper[4775]: I0321 04:50:01.660675 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:01 crc kubenswrapper[4775]: E0321 04:50:01.660821 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:01 crc kubenswrapper[4775]: I0321 04:50:01.660900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:01 crc kubenswrapper[4775]: I0321 04:50:01.661326 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:01 crc kubenswrapper[4775]: E0321 04:50:01.661270 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:01 crc kubenswrapper[4775]: E0321 04:50:01.661488 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:02 crc kubenswrapper[4775]: I0321 04:50:02.660962 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:02 crc kubenswrapper[4775]: E0321 04:50:02.661178 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:03 crc kubenswrapper[4775]: I0321 04:50:03.661016 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:03 crc kubenswrapper[4775]: I0321 04:50:03.661014 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:03 crc kubenswrapper[4775]: I0321 04:50:03.661376 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:03 crc kubenswrapper[4775]: E0321 04:50:03.661239 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:03 crc kubenswrapper[4775]: E0321 04:50:03.661477 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:03 crc kubenswrapper[4775]: E0321 04:50:03.661680 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:03 crc kubenswrapper[4775]: E0321 04:50:03.763619 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:04 crc kubenswrapper[4775]: I0321 04:50:04.660435 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:04 crc kubenswrapper[4775]: E0321 04:50:04.660616 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:04 crc kubenswrapper[4775]: I0321 04:50:04.661394 4775 scope.go:117] "RemoveContainer" containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" Mar 21 04:50:04 crc kubenswrapper[4775]: I0321 04:50:04.996858 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/2.log" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.000630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.002590 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.020406 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.036050 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.050759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.064304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.076578 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.099401 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.111714 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.129141 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:50:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.139983 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc 
kubenswrapper[4775]: I0321 04:50:05.153384 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.167381 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.193137 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.204345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.242680 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.265796 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.287917 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.301264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.316248 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f1
87c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.327322 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.660883 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.660919 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:05 crc kubenswrapper[4775]: I0321 04:50:05.661047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:05 crc kubenswrapper[4775]: E0321 04:50:05.661283 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:05 crc kubenswrapper[4775]: E0321 04:50:05.661408 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:05 crc kubenswrapper[4775]: E0321 04:50:05.661531 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.006036 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/3.log" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.006751 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/2.log" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.009939 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" exitCode=1 Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.009983 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.010030 4775 scope.go:117] "RemoveContainer" containerID="a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.010687 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 04:50:06 crc kubenswrapper[4775]: E0321 04:50:06.010951 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:50:06 crc kubenswrapper[4775]: 
I0321 04:50:06.031664 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.047282 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.060317 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.073089 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.095234 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.108641 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.128349 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3a8e35d65afe6149f7a4d22b0c351839afb61b412d54dfdcd93311c341d263b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:34Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949601 7032 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd-operator/metrics]} name:Service_openshift-etcd-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0321 04:49:33.949663 7032 factory.go:656] Stopping watch factory\\\\nI0321 04:49:33.949677 7032 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:49:33.949692 7032 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:49:33.949770 7032 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:50:05Z\\\",\\\"message\\\":\\\"05.638184 7377 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0321 04:50:05.638209 7377 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:50:05.638213 7377 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz openshift-multus/multus-556rg openshift-network-node-identity/network-node-identity-vrzqb openshift-network-opera\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:50:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cn
i-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",
\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc 
kubenswrapper[4775]: I0321 04:50:06.140190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.154517 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.168105 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.178386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.190429 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.200295 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.211633 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc 
kubenswrapper[4775]: I0321 04:50:06.222209 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.232947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.247817 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.260446 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.273451 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:06 crc kubenswrapper[4775]: I0321 04:50:06.660974 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:06 crc kubenswrapper[4775]: E0321 04:50:06.661215 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.014309 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/3.log" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.018186 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 04:50:07 crc kubenswrapper[4775]: E0321 04:50:07.018355 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.034080 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.046916 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.057152 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.072836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.093730 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.105345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.132272 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:50:05Z\\\",\\\"message\\\":\\\"05.638184 7377 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0321 04:50:05.638209 7377 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add 
Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:50:05.638213 7377 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz openshift-multus/multus-556rg openshift-network-node-identity/network-node-identity-vrzqb openshift-network-opera\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:50:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.145809 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.160567 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.173484 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.186355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.198608 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.209213 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.219945 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc 
kubenswrapper[4775]: I0321 04:50:07.229670 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.239277 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.254554 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.266547 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.277471 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.661327 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.661375 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:07 crc kubenswrapper[4775]: E0321 04:50:07.661475 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.661579 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:07 crc kubenswrapper[4775]: E0321 04:50:07.661655 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:07 crc kubenswrapper[4775]: E0321 04:50:07.661807 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.686183 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.705673 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.722638 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.738249 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.752294 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.779550 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.795901 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.816571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:50:05Z\\\",\\\"message\\\":\\\"05.638184 7377 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: 
retrying failed objects of type *v1.Pod\\\\nF0321 04:50:05.638209 7377 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:50:05.638213 7377 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz openshift-multus/multus-556rg openshift-network-node-identity/network-node-identity-vrzqb openshift-network-opera\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:50:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.831650 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.846190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.861922 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.876246 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.890930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.907295 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.923974 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.942949 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.955754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.972895 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f1
87c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:07 crc kubenswrapper[4775]: I0321 04:50:07.989021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-21T04:50:07Z is after 2025-08-24T17:21:41Z"
Mar 21 04:50:08 crc kubenswrapper[4775]: I0321 04:50:08.660363 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:50:08 crc kubenswrapper[4775]: E0321 04:50:08.660693 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:50:08 crc kubenswrapper[4775]: E0321 04:50:08.764453 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.661203 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.661203 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5"
Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.661367 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.661465 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.661222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.661568 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.823469 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.823509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.823521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.823536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.823546 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:09Z","lastTransitionTime":"2026-03-21T04:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.836071 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:09Z is after 2025-08-24T17:21:41Z"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.840111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.840161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.840174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.840188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.840198 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:09Z","lastTransitionTime":"2026-03-21T04:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.854182 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.858390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.858423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.858643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.858663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.858675 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:09Z","lastTransitionTime":"2026-03-21T04:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.874367 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:09Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.877515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.877545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.877558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.877574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.877586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:09Z","lastTransitionTime":"2026-03-21T04:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.888281 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:09Z is after 2025-08-24T17:21:41Z"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.891372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.891408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.891420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.891436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:50:09 crc kubenswrapper[4775]: I0321 04:50:09.891448 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:09Z","lastTransitionTime":"2026-03-21T04:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.902985 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:09Z is after 2025-08-24T17:21:41Z"
Mar 21 04:50:09 crc kubenswrapper[4775]: E0321 04:50:09.903107 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 21 04:50:10 crc kubenswrapper[4775]: I0321 04:50:10.660364 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:50:10 crc kubenswrapper[4775]: E0321 04:50:10.660761 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:50:11 crc kubenswrapper[4775]: I0321 04:50:11.661008 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:50:11 crc kubenswrapper[4775]: I0321 04:50:11.661043 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5"
Mar 21 04:50:11 crc kubenswrapper[4775]: I0321 04:50:11.661105 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:50:11 crc kubenswrapper[4775]: E0321 04:50:11.661246 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:50:11 crc kubenswrapper[4775]: E0321 04:50:11.661331 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c"
Mar 21 04:50:11 crc kubenswrapper[4775]: E0321 04:50:11.661408 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:50:12 crc kubenswrapper[4775]: I0321 04:50:12.660392 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:50:12 crc kubenswrapper[4775]: E0321 04:50:12.660581 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:50:13 crc kubenswrapper[4775]: I0321 04:50:13.660462 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:50:13 crc kubenswrapper[4775]: E0321 04:50:13.660618 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:50:13 crc kubenswrapper[4775]: I0321 04:50:13.660728 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:50:13 crc kubenswrapper[4775]: E0321 04:50:13.660799 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:50:13 crc kubenswrapper[4775]: I0321 04:50:13.660875 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5"
Mar 21 04:50:13 crc kubenswrapper[4775]: E0321 04:50:13.660932 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c"
Mar 21 04:50:13 crc kubenswrapper[4775]: E0321 04:50:13.766214 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 21 04:50:14 crc kubenswrapper[4775]: I0321 04:50:14.660851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:50:14 crc kubenswrapper[4775]: E0321 04:50:14.661007 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:50:15 crc kubenswrapper[4775]: I0321 04:50:15.661243 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:50:15 crc kubenswrapper[4775]: I0321 04:50:15.661375 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:50:15 crc kubenswrapper[4775]: I0321 04:50:15.661255 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5"
Mar 21 04:50:15 crc kubenswrapper[4775]: E0321 04:50:15.661368 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:50:15 crc kubenswrapper[4775]: E0321 04:50:15.661549 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:50:15 crc kubenswrapper[4775]: E0321 04:50:15.661576 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c"
Mar 21 04:50:16 crc kubenswrapper[4775]: I0321 04:50:16.660734 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:50:16 crc kubenswrapper[4775]: E0321 04:50:16.661078 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.660933 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.661005 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.660939 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5"
Mar 21 04:50:17 crc kubenswrapper[4775]: E0321 04:50:17.661204 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:50:17 crc kubenswrapper[4775]: E0321 04:50:17.661326 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:50:17 crc kubenswrapper[4775]: E0321 04:50:17.661431 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c"
Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.685785 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a84635b7-13d6-414b-b564-df905a75f78e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd489dc7122bc77a71fedd534450c46432fd565c47c5a1bf0adbdbf692ff5846\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06b2cb55b98ba78b43ee20318d2240093a0ac190d08959d632fae08c0503d987\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ef017956ac1ac2629ec584876a1730d53fdb87672733c03f24391ba91530939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://202a0f82cbe6c8445fcb51a0878e276075a419dec4a93210947380a97da1bf56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf1328894b894eaf46c2d09d835b36aa7e931a51a0ab34bc0c7bb9b7b611e9a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1474ef03c902f50e4c4565e4e251b19a9126b43e3918cc89871b86a499fa057\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://027439ea9557ee91465a3e603ef3d3d050016c9a92b039ede3cf6c6c17813539\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://336231c7d1d5fb22fd7ce69748f0263b9b20a45a6c60d3db6864d29960909645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.700526 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.723737 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a69d31f5-deeb-4860-be96-ed5547831685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:50:05Z\\\",\\\"message\\\":\\\"05.638184 7377 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0321 04:50:05.638209 7377 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add 
Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:05Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:50:05.638213 7377 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-kube-apiserver/kube-apiserver-crc openshift-machine-config-operator/kube-rbac-proxy-crio-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz openshift-multus/multus-556rg openshift-network-node-identity/network-node-identity-vrzqb openshift-network-opera\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:50:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://effb207e18cc86a07c
45d973b406a356cac22f7c470b51edfb44907c03dddea6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7w6b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mzqtk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.738225 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a197f323-38cd-41a6-ae73-5d16706c9224\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f04df41528bb781c23382f46200bd21b1d0e6064f535d4d47cde4cabedef8c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71042d514c08b225f10915e55a0ef253281a017b05412ca810a5fbbfee272bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a3e2381fa82bd51ba8be325e0a83a387ebd790d6a91283d604ab46f9ae7d45e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://99787fe5e44de1b5adaa5d1b49dcb6db27a4176d232dc67119b5f633bfa4c948\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.755875 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e4957b1b3b81f43f30ab64a060d6535b875e1aa8a77710f7c258245be089b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.771664 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ead4a6d7f69da8214783f61c88eef102aaa4b3cc06971e669dda923139ecbd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://e479f2f8f2d9e9576d61a6e3ab302409f350cc28c555046c1ca6431307afaecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.788541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.805264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-556rg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e77ec218-42da-4f07-b214-184c4f3b20f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:49:52Z\\\",\\\"message\\\":\\\"2026-03-21T04:49:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f\\\\n2026-03-21T04:49:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_71b8a2db-de4c-4816-881e-455a501c594f to /host/opt/cni/bin/\\\\n2026-03-21T04:49:07Z [verbose] multus-daemon started\\\\n2026-03-21T04:49:07Z [verbose] 
Readiness Indicator file check\\\\n2026-03-21T04:49:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vft22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-556rg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.817927 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cc87971f-e8fc-454d-8513-957a0bbad389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88f7eb6f5b1fecd00
254bb66366d1fa1d90231b1927926acd0cc69bbb2f45fc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2kj8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:04Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.829946 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6920413a-2c51-466d-a16e-d14489ae0c6c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h68dp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:15Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xk9f5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc 
kubenswrapper[4775]: I0321 04:50:17.843266 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdcbc41e-a246-40f6-bf50-36d3c217315a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0cf88511498eb0f6182819ecb78616730fa9bfbe187057ea891f3ba73e550287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://375572b0f0f7a6917d62ef1138d8651eb2d6118d420fc647d1d24c29e793f17d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.857005 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-khh7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70cca5ae-9de4-4933-a6f8-4a23ab711bbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb124fa0b2fc2582e8badf94a43d71a33b8865c5c466ab3bdcbb57e696de627e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwtgh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:01Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-khh7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.873579 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kldzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"957cde70-ca20-438a-a4bf-42481dddb2db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12fc5d649ea6218fcab145822c0f2cd86e0513c53c81213eb7ed68fb3afde674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10f60b38b40f94cc0e8e5cb4be791449552e2a54d23904a16227672eed7b3b18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fb5be104ab810117cde44c92e3a7dfe6ea7d7d1c728d89b6172379ecb1bbc7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7261c2582b4756b7bf4c3acf67391dd7f0c478ed849b253ba9c511876a00785\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://362f187c90be3c63ca293d844a192b1ddcdbca38ac34a0f59ce1faeb048e0de5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbf
efabd735ed217d85ac200b42b8d069fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b8be7f3b8b80c96c9bc89fcf927dbfefabd735ed217d85ac200b42b8d069fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:49:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b07461492cde5a45e3db3c269dc79fb45d9340b15b69375d11512fba609750f1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:49:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-21T04:49:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqnl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kldzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.886281 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"38cb395c-744c-4c2e-9e32-b6cb206a9c5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99822baef16853fef450fdf2be6c8bd67171afc4b0a34e172b108f57738717d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://226b410d4d80ed4dad83cf4c8e3d5c470f8d2
cbae7852a11fde40d35d00ee235\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l4d4t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-62jtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.898754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058178a5-0360-40e4-9baf-5478bac349dd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5615480a5b452c6bce19fa04738bb5d68984f62258c9486fc2ae787cfb9b7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851dab44c58218d203e180465f8a216bfeec8271e78102856d11545520feabb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:47:55Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0321 04:47:29.750929 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:47:29.753390 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:47:29.787802 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:47:29.793292 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:47:55.400421 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:47:55.400543 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb44c7daffbeddbb39314ec11bd3235996e19b520875337dcac1ffd14f35953\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9299f9db8cbdd2dbd04e47bd1eba4f95724ec3dd04ec5083564dc0c38a960457\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.912605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.925631 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:48:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb93fe033377ef7e4bee2751d04558c879dea241df496c9222e2574272922143\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.938025 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cffcf487-ef41-4395-81eb-e5e6358f4a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3317fa0c2a16ea0f8c6be9630e18abbd9fa2a5b55f51ae7f60be585b898fc75c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9pxr5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:49:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qc7hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:17 crc kubenswrapper[4775]: I0321 04:50:17.952190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbc8474b-4360-449d-ab37-ba14ca1ac5ed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:47:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:48:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0321 04:48:33.168319 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0321 04:48:33.168481 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:48:33.169188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3515854145/tls.crt::/tmp/serving-cert-3515854145/tls.key\\\\\\\"\\\\nI0321 04:48:33.461779 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0321 04:48:33.464865 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0321 04:48:33.464898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0321 04:48:33.464929 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0321 04:48:33.464945 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0321 04:48:33.471450 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0321 04:48:33.471478 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0321 04:48:33.471486 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0321 04:48:33.471486 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0321 04:48:33.471491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0321 04:48:33.471520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0321 04:48:33.471523 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0321 04:48:33.473616 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:48:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:49:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:47:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://381f5c10200a4e21bc69a5b9f8eca747d80
3e177b8497ebe7d82f12c98aea3ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:47:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:47:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:47:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:18 crc kubenswrapper[4775]: I0321 04:50:18.660965 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:18 crc kubenswrapper[4775]: E0321 04:50:18.661096 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:18 crc kubenswrapper[4775]: E0321 04:50:18.767723 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.660745 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.660798 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.660825 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.661190 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.661289 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.661446 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.662256 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.662399 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.923893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.923937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.923949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.923971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.923983 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:19Z","lastTransitionTime":"2026-03-21T04:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.940459 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.946585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.946677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.946698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.946776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.946847 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:19Z","lastTransitionTime":"2026-03-21T04:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.962534 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.962965 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.962814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.963037 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs podName:6920413a-2c51-466d-a16e-d14489ae0c6c nodeName:}" failed. No retries permitted until 2026-03-21 04:51:23.963016644 +0000 UTC m=+236.939480268 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs") pod "network-metrics-daemon-xk9f5" (UID: "6920413a-2c51-466d-a16e-d14489ae0c6c") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.968283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.968319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.968331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.968349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.968361 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:19Z","lastTransitionTime":"2026-03-21T04:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:19 crc kubenswrapper[4775]: E0321 04:50:19.984473 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:19Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.988836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.988923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.988941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.988964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:19 crc kubenswrapper[4775]: I0321 04:50:19.988983 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:19Z","lastTransitionTime":"2026-03-21T04:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:20 crc kubenswrapper[4775]: E0321 04:50:20.003452 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:20Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.007491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.007548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.007565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.007586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.007604 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:20Z","lastTransitionTime":"2026-03-21T04:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:50:20 crc kubenswrapper[4775]: E0321 04:50:20.027422 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:50:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"82de5c1b-41dd-43ea-8a7a-ca3ee73d6e07\\\",\\\"systemUUID\\\":\\\"b8f8f0e2-c78b-43d3-976e-2a86ca08a185\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:50:20Z is after 2025-08-24T17:21:41Z" Mar 21 04:50:20 crc kubenswrapper[4775]: E0321 04:50:20.027690 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:50:20 crc kubenswrapper[4775]: I0321 04:50:20.661364 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:20 crc kubenswrapper[4775]: E0321 04:50:20.661527 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:21 crc kubenswrapper[4775]: I0321 04:50:21.660929 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:21 crc kubenswrapper[4775]: E0321 04:50:21.661060 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:21 crc kubenswrapper[4775]: I0321 04:50:21.661240 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:21 crc kubenswrapper[4775]: I0321 04:50:21.661267 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:21 crc kubenswrapper[4775]: E0321 04:50:21.661738 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:21 crc kubenswrapper[4775]: E0321 04:50:21.661894 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:22 crc kubenswrapper[4775]: I0321 04:50:22.660716 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:22 crc kubenswrapper[4775]: E0321 04:50:22.661040 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:23 crc kubenswrapper[4775]: I0321 04:50:23.660586 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:23 crc kubenswrapper[4775]: I0321 04:50:23.660686 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:23 crc kubenswrapper[4775]: E0321 04:50:23.660723 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:23 crc kubenswrapper[4775]: E0321 04:50:23.660827 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:23 crc kubenswrapper[4775]: I0321 04:50:23.660851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:23 crc kubenswrapper[4775]: E0321 04:50:23.661035 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:23 crc kubenswrapper[4775]: E0321 04:50:23.768531 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:24 crc kubenswrapper[4775]: I0321 04:50:24.661165 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:24 crc kubenswrapper[4775]: E0321 04:50:24.662227 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:25 crc kubenswrapper[4775]: I0321 04:50:25.660631 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:25 crc kubenswrapper[4775]: I0321 04:50:25.660641 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:25 crc kubenswrapper[4775]: E0321 04:50:25.660753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:25 crc kubenswrapper[4775]: I0321 04:50:25.660708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:25 crc kubenswrapper[4775]: E0321 04:50:25.660820 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:25 crc kubenswrapper[4775]: E0321 04:50:25.660922 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:26 crc kubenswrapper[4775]: I0321 04:50:26.661273 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:26 crc kubenswrapper[4775]: E0321 04:50:26.661425 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.660565 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:27 crc kubenswrapper[4775]: E0321 04:50:27.660730 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.660945 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.660982 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:27 crc kubenswrapper[4775]: E0321 04:50:27.661031 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:27 crc kubenswrapper[4775]: E0321 04:50:27.661190 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.688340 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.688325228 podStartE2EDuration="1m29.688325228s" podCreationTimestamp="2026-03-21 04:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.687913807 +0000 UTC m=+180.664377451" watchObservedRunningTime="2026-03-21 04:50:27.688325228 +0000 UTC m=+180.664788842" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.741450 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podStartSLOduration=121.741432741 podStartE2EDuration="2m1.741432741s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.726300804 +0000 UTC m=+180.702764428" watchObservedRunningTime="2026-03-21 04:50:27.741432741 +0000 UTC m=+180.717896365" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.772628 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.7726097 podStartE2EDuration="44.7726097s" podCreationTimestamp="2026-03-21 04:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.741600106 +0000 UTC m=+180.718063720" watchObservedRunningTime="2026-03-21 04:50:27.7726097 +0000 UTC m=+180.749073324" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.784942 4775 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=94.784918416 podStartE2EDuration="1m34.784918416s" podCreationTimestamp="2026-03-21 04:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.772448196 +0000 UTC m=+180.748911820" watchObservedRunningTime="2026-03-21 04:50:27.784918416 +0000 UTC m=+180.761382040" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.818683 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.818667589 podStartE2EDuration="31.818667589s" podCreationTimestamp="2026-03-21 04:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.818405622 +0000 UTC m=+180.794869246" watchObservedRunningTime="2026-03-21 04:50:27.818667589 +0000 UTC m=+180.795131213" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.876872 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-556rg" podStartSLOduration=121.876841218 podStartE2EDuration="2m1.876841218s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.876798187 +0000 UTC m=+180.853261821" watchObservedRunningTime="2026-03-21 04:50:27.876841218 +0000 UTC m=+180.853304842" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.885112 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kr988" podStartSLOduration=121.885093286 podStartE2EDuration="2m1.885093286s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.885074486 +0000 UTC m=+180.861538110" watchObservedRunningTime="2026-03-21 04:50:27.885093286 +0000 UTC m=+180.861556920" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.908730 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=33.908687757 podStartE2EDuration="33.908687757s" podCreationTimestamp="2026-03-21 04:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.907556734 +0000 UTC m=+180.884020358" watchObservedRunningTime="2026-03-21 04:50:27.908687757 +0000 UTC m=+180.885151401" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.919915 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-khh7x" podStartSLOduration=121.91989547 podStartE2EDuration="2m1.91989547s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.919154159 +0000 UTC m=+180.895617793" watchObservedRunningTime="2026-03-21 04:50:27.91989547 +0000 UTC m=+180.896359094" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.935456 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kldzh" podStartSLOduration=121.935441159 podStartE2EDuration="2m1.935441159s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.934861632 +0000 UTC m=+180.911325256" watchObservedRunningTime="2026-03-21 04:50:27.935441159 +0000 UTC m=+180.911904783" Mar 21 04:50:27 crc kubenswrapper[4775]: I0321 04:50:27.950032 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-62jtz" podStartSLOduration=121.950014929 podStartE2EDuration="2m1.950014929s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:27.949067802 +0000 UTC m=+180.925531426" watchObservedRunningTime="2026-03-21 04:50:27.950014929 +0000 UTC m=+180.926478553" Mar 21 04:50:28 crc kubenswrapper[4775]: I0321 04:50:28.660371 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:28 crc kubenswrapper[4775]: E0321 04:50:28.660529 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:28 crc kubenswrapper[4775]: E0321 04:50:28.770306 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:29 crc kubenswrapper[4775]: I0321 04:50:29.661194 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:29 crc kubenswrapper[4775]: E0321 04:50:29.661344 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:29 crc kubenswrapper[4775]: I0321 04:50:29.661472 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:29 crc kubenswrapper[4775]: I0321 04:50:29.661592 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:29 crc kubenswrapper[4775]: E0321 04:50:29.661630 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:29 crc kubenswrapper[4775]: E0321 04:50:29.661826 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.378432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.378482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.378497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.378514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.378528 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:50:30Z","lastTransitionTime":"2026-03-21T04:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.424377 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8"] Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.424954 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.427361 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.427383 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.427431 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.427484 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.453694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.453744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.453799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9758cbde-aaf6-4097-b10e-233d3efd75f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.453875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9758cbde-aaf6-4097-b10e-233d3efd75f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.453932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9758cbde-aaf6-4097-b10e-233d3efd75f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555055 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9758cbde-aaf6-4097-b10e-233d3efd75f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9758cbde-aaf6-4097-b10e-233d3efd75f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" 
Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9758cbde-aaf6-4097-b10e-233d3efd75f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555405 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555496 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.555552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9758cbde-aaf6-4097-b10e-233d3efd75f5-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.556373 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9758cbde-aaf6-4097-b10e-233d3efd75f5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.560894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9758cbde-aaf6-4097-b10e-233d3efd75f5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.580857 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9758cbde-aaf6-4097-b10e-233d3efd75f5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fwjf8\" (UID: \"9758cbde-aaf6-4097-b10e-233d3efd75f5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.660615 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:30 crc kubenswrapper[4775]: E0321 04:50:30.660734 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.740388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.786264 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 21 04:50:30 crc kubenswrapper[4775]: I0321 04:50:30.794241 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.091347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" event={"ID":"9758cbde-aaf6-4097-b10e-233d3efd75f5","Type":"ContainerStarted","Data":"f6c562abdcccdfd9d6df459e61ea1b3bee4017e7f28176b6a472e2488d95f2ae"} Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.091400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" event={"ID":"9758cbde-aaf6-4097-b10e-233d3efd75f5","Type":"ContainerStarted","Data":"104c2205710b17c9c323460ac909eca961b1f4987df297698654484e316c551c"} Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.103856 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fwjf8" podStartSLOduration=125.10383472 podStartE2EDuration="2m5.10383472s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:31.103713836 +0000 UTC m=+184.080177470" watchObservedRunningTime="2026-03-21 04:50:31.10383472 +0000 UTC m=+184.080298344" 
Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.661201 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.661246 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:31 crc kubenswrapper[4775]: I0321 04:50:31.661201 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:31 crc kubenswrapper[4775]: E0321 04:50:31.661325 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:31 crc kubenswrapper[4775]: E0321 04:50:31.661396 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:31 crc kubenswrapper[4775]: E0321 04:50:31.661496 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:32 crc kubenswrapper[4775]: I0321 04:50:32.660879 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:32 crc kubenswrapper[4775]: E0321 04:50:32.661113 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:33 crc kubenswrapper[4775]: I0321 04:50:33.661397 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:33 crc kubenswrapper[4775]: I0321 04:50:33.661408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:33 crc kubenswrapper[4775]: E0321 04:50:33.661554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:33 crc kubenswrapper[4775]: E0321 04:50:33.661717 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:33 crc kubenswrapper[4775]: I0321 04:50:33.661840 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:33 crc kubenswrapper[4775]: E0321 04:50:33.662097 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:33 crc kubenswrapper[4775]: I0321 04:50:33.662488 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 04:50:33 crc kubenswrapper[4775]: E0321 04:50:33.662692 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mzqtk_openshift-ovn-kubernetes(a69d31f5-deeb-4860-be96-ed5547831685)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" Mar 21 04:50:33 crc kubenswrapper[4775]: E0321 04:50:33.771504 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:34 crc kubenswrapper[4775]: I0321 04:50:34.660703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:34 crc kubenswrapper[4775]: E0321 04:50:34.660838 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:35 crc kubenswrapper[4775]: I0321 04:50:35.661144 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:35 crc kubenswrapper[4775]: I0321 04:50:35.661173 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:35 crc kubenswrapper[4775]: I0321 04:50:35.661232 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:35 crc kubenswrapper[4775]: E0321 04:50:35.661527 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:35 crc kubenswrapper[4775]: E0321 04:50:35.661623 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:35 crc kubenswrapper[4775]: E0321 04:50:35.661361 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:36 crc kubenswrapper[4775]: I0321 04:50:36.661547 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:36 crc kubenswrapper[4775]: E0321 04:50:36.661707 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:37 crc kubenswrapper[4775]: I0321 04:50:37.661103 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:37 crc kubenswrapper[4775]: I0321 04:50:37.661209 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:37 crc kubenswrapper[4775]: I0321 04:50:37.661286 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:37 crc kubenswrapper[4775]: E0321 04:50:37.663065 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:37 crc kubenswrapper[4775]: E0321 04:50:37.663212 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:37 crc kubenswrapper[4775]: E0321 04:50:37.663405 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:38 crc kubenswrapper[4775]: I0321 04:50:38.660483 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:38 crc kubenswrapper[4775]: E0321 04:50:38.660989 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:38 crc kubenswrapper[4775]: E0321 04:50:38.772777 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.117033 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/1.log" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.117837 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/0.log" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.117904 4775 generic.go:334] "Generic (PLEG): container finished" podID="e77ec218-42da-4f07-b214-184c4f3b20f3" containerID="eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae" exitCode=1 Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.117954 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerDied","Data":"eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae"} Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.118008 4775 scope.go:117] "RemoveContainer" containerID="77509499e5f9ecba799d75a9e9f99a1b2ad760f16f11a518f1d8bd7f848acabf" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.118526 4775 scope.go:117] "RemoveContainer" containerID="eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae" Mar 21 04:50:39 crc kubenswrapper[4775]: E0321 04:50:39.118709 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-multus pod=multus-556rg_openshift-multus(e77ec218-42da-4f07-b214-184c4f3b20f3)\"" pod="openshift-multus/multus-556rg" podUID="e77ec218-42da-4f07-b214-184c4f3b20f3" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.661500 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.661611 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:39 crc kubenswrapper[4775]: I0321 04:50:39.661964 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:39 crc kubenswrapper[4775]: E0321 04:50:39.662091 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:39 crc kubenswrapper[4775]: E0321 04:50:39.662293 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:39 crc kubenswrapper[4775]: E0321 04:50:39.662383 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:40 crc kubenswrapper[4775]: I0321 04:50:40.121998 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/1.log" Mar 21 04:50:40 crc kubenswrapper[4775]: I0321 04:50:40.660898 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:40 crc kubenswrapper[4775]: E0321 04:50:40.661065 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:41 crc kubenswrapper[4775]: I0321 04:50:41.660505 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:41 crc kubenswrapper[4775]: I0321 04:50:41.660592 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:41 crc kubenswrapper[4775]: E0321 04:50:41.660664 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:41 crc kubenswrapper[4775]: I0321 04:50:41.660499 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:41 crc kubenswrapper[4775]: E0321 04:50:41.660912 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:41 crc kubenswrapper[4775]: E0321 04:50:41.660976 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:42 crc kubenswrapper[4775]: I0321 04:50:42.660331 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:42 crc kubenswrapper[4775]: E0321 04:50:42.660535 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:43 crc kubenswrapper[4775]: I0321 04:50:43.660800 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:43 crc kubenswrapper[4775]: I0321 04:50:43.660840 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:43 crc kubenswrapper[4775]: E0321 04:50:43.661065 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:43 crc kubenswrapper[4775]: E0321 04:50:43.661097 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:43 crc kubenswrapper[4775]: I0321 04:50:43.660861 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:43 crc kubenswrapper[4775]: E0321 04:50:43.661196 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:43 crc kubenswrapper[4775]: E0321 04:50:43.774668 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:44 crc kubenswrapper[4775]: I0321 04:50:44.661315 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:44 crc kubenswrapper[4775]: E0321 04:50:44.661442 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:45 crc kubenswrapper[4775]: I0321 04:50:45.660920 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:45 crc kubenswrapper[4775]: I0321 04:50:45.661193 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:45 crc kubenswrapper[4775]: I0321 04:50:45.661210 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:45 crc kubenswrapper[4775]: E0321 04:50:45.661573 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:45 crc kubenswrapper[4775]: E0321 04:50:45.661655 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:45 crc kubenswrapper[4775]: E0321 04:50:45.661687 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:46 crc kubenswrapper[4775]: I0321 04:50:46.660321 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:46 crc kubenswrapper[4775]: E0321 04:50:46.660707 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:47 crc kubenswrapper[4775]: I0321 04:50:47.660299 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:47 crc kubenswrapper[4775]: E0321 04:50:47.661633 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:47 crc kubenswrapper[4775]: I0321 04:50:47.661677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:47 crc kubenswrapper[4775]: I0321 04:50:47.661733 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:47 crc kubenswrapper[4775]: E0321 04:50:47.661877 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:47 crc kubenswrapper[4775]: E0321 04:50:47.661939 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:48 crc kubenswrapper[4775]: I0321 04:50:48.661501 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:48 crc kubenswrapper[4775]: E0321 04:50:48.662024 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:48 crc kubenswrapper[4775]: I0321 04:50:48.662279 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 04:50:48 crc kubenswrapper[4775]: E0321 04:50:48.775943 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.182066 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/3.log" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.184581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerStarted","Data":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.185033 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.213866 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podStartSLOduration=143.213839381 podStartE2EDuration="2m23.213839381s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:50:49.211452582 +0000 UTC m=+202.187916206" watchObservedRunningTime="2026-03-21 04:50:49.213839381 +0000 UTC m=+202.190303005" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.519295 4775 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xk9f5"] Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.519411 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:49 crc kubenswrapper[4775]: E0321 04:50:49.519486 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.661218 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:49 crc kubenswrapper[4775]: I0321 04:50:49.661293 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:49 crc kubenswrapper[4775]: E0321 04:50:49.661336 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:49 crc kubenswrapper[4775]: E0321 04:50:49.661485 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:50 crc kubenswrapper[4775]: I0321 04:50:50.661100 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:50 crc kubenswrapper[4775]: E0321 04:50:50.661267 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:51 crc kubenswrapper[4775]: I0321 04:50:51.661312 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:51 crc kubenswrapper[4775]: I0321 04:50:51.661318 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:51 crc kubenswrapper[4775]: E0321 04:50:51.661573 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:51 crc kubenswrapper[4775]: I0321 04:50:51.661611 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:51 crc kubenswrapper[4775]: E0321 04:50:51.661829 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:51 crc kubenswrapper[4775]: E0321 04:50:51.661906 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:52 crc kubenswrapper[4775]: I0321 04:50:52.661014 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:52 crc kubenswrapper[4775]: E0321 04:50:52.661353 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:53 crc kubenswrapper[4775]: I0321 04:50:53.660681 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:53 crc kubenswrapper[4775]: I0321 04:50:53.660786 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:53 crc kubenswrapper[4775]: I0321 04:50:53.660825 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:53 crc kubenswrapper[4775]: E0321 04:50:53.660966 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:53 crc kubenswrapper[4775]: E0321 04:50:53.661211 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:53 crc kubenswrapper[4775]: I0321 04:50:53.661221 4775 scope.go:117] "RemoveContainer" containerID="eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae" Mar 21 04:50:53 crc kubenswrapper[4775]: E0321 04:50:53.661266 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:53 crc kubenswrapper[4775]: E0321 04:50:53.778144 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:50:54 crc kubenswrapper[4775]: I0321 04:50:54.213641 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/1.log" Mar 21 04:50:54 crc kubenswrapper[4775]: I0321 04:50:54.213713 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerStarted","Data":"ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e"} Mar 21 04:50:54 crc kubenswrapper[4775]: I0321 04:50:54.660534 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:54 crc kubenswrapper[4775]: E0321 04:50:54.660753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:55 crc kubenswrapper[4775]: I0321 04:50:55.660927 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:55 crc kubenswrapper[4775]: I0321 04:50:55.660927 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:55 crc kubenswrapper[4775]: I0321 04:50:55.660987 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:55 crc kubenswrapper[4775]: E0321 04:50:55.662009 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:55 crc kubenswrapper[4775]: E0321 04:50:55.662369 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:55 crc kubenswrapper[4775]: E0321 04:50:55.662405 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:56 crc kubenswrapper[4775]: I0321 04:50:56.660758 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:56 crc kubenswrapper[4775]: E0321 04:50:56.660897 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:57 crc kubenswrapper[4775]: I0321 04:50:57.660936 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:57 crc kubenswrapper[4775]: I0321 04:50:57.660946 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:57 crc kubenswrapper[4775]: E0321 04:50:57.663392 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:50:57 crc kubenswrapper[4775]: I0321 04:50:57.663646 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:57 crc kubenswrapper[4775]: E0321 04:50:57.663742 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk9f5" podUID="6920413a-2c51-466d-a16e-d14489ae0c6c" Mar 21 04:50:57 crc kubenswrapper[4775]: E0321 04:50:57.663926 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:50:58 crc kubenswrapper[4775]: I0321 04:50:58.660968 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:50:58 crc kubenswrapper[4775]: E0321 04:50:58.661102 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.661388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.661490 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.661439 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.664380 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.664896 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.665302 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:50:59 crc kubenswrapper[4775]: I0321 04:50:59.667764 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.418338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:00 crc kubenswrapper[4775]: E0321 04:51:00.418616 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:53:02.418581556 +0000 UTC m=+335.395045180 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.419933 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.419989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:51:00 crc kubenswrapper[4775]: E0321 04:51:00.420176 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:51:00 crc kubenswrapper[4775]: E0321 04:51:00.420224 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:02.420211963 +0000 UTC m=+335.396675587 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:51:00 crc kubenswrapper[4775]: E0321 04:51:00.420513 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:51:00 crc kubenswrapper[4775]: E0321 04:51:00.420678 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:53:02.420656776 +0000 UTC m=+335.397120400 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.520732 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.520779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.528057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.529800 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.582172 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.600988 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.660353 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.663526 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.665008 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.831869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.877227 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.877890 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.878724 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.878961 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.879277 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.879308 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.879549 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.879597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.881048 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fc4z8"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.881467 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.881920 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.882354 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.885854 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.886548 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.886937 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.887345 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.887500 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6g5"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.888053 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.888219 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wst2s"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.888638 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.889214 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.889460 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.899028 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.900754 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.900931 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.901436 4775 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"] Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.901755 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902014 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902362 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902583 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902700 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902724 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902803 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902818 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902823 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.902888 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903038 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903136 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903168 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903245 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903356 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903379 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903408 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903591 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903739 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.903940 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904225 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904272 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904304 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904378 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904412 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904441 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904593 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904645 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:51:00 crc 
kubenswrapper[4775]: I0321 04:51:00.904700    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904725    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904792    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905000    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905022    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905053    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905096    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904507    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905204    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.904505    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905275    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905178    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905345    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905423    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905441    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905573    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905656    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905723    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.905776    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"]
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.906146    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.906426    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nstnr"]
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.907136    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.907870    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"]
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.908102    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nstnr"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.908193    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.909257    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.909267    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.911511    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.912202    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.912611    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.912710    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.912837    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.913040    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.913805    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-468g4"]
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.914421    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.921348    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.923716    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.923864    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.923924    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.923999    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924018    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924477    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924591    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924799    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924848    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.924953    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.927193    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 21 04:51:00 crc kubenswrapper[4775]: I0321 04:51:00.927275    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.052086    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.052558    4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.053227    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh"]
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.053300    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.053336    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-client\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.053364    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-encryption-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.053770    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.054078    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.055013    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh"]
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.055536    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"]
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.055857    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.055878    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056039    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-config\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.055859    4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056093    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3df1e1-4c22-48df-aaea-469c864f0310-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056158    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056187    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nbfq\" (UniqueName: \"kubernetes.io/projected/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-kube-api-access-7nbfq\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056218    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056247    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056271    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkw4\" (UniqueName: \"kubernetes.io/projected/8bb7828f-6d99-4539-8312-c8e96bfbc608-kube-api-access-lqkw4\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056298    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lnkp"]
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056368    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056438    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056519    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit-dir\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056562    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb4245a-7971-4c40-81b6-27d56b319a2f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056623    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056651    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-trusted-ca\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056689    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892d15b6-460e-4892-a836-0cc284c8a326-serving-cert\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056717    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056796    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056842    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056865    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056887    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-serving-cert\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056921    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-config\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056941    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-dir\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056962    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k97vh\" (UniqueName: \"kubernetes.io/projected/fe3df1e1-4c22-48df-aaea-469c864f0310-kube-api-access-k97vh\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056977    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.056992    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057010    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057027    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhsgq\" (UniqueName: \"kubernetes.io/projected/c3d58eba-4ddf-463c-baa1-1943fb60c732-kube-api-access-bhsgq\") pod \"downloads-7954f5f757-nstnr\" (UID: \"c3d58eba-4ddf-463c-baa1-1943fb60c732\") " pod="openshift-console/downloads-7954f5f757-nstnr"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057041    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrqw\" (UniqueName: \"kubernetes.io/projected/768c9343-1391-4221-b008-5dee2921953f-kube-api-access-ntrqw\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057058    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zb5h\" (UniqueName: \"kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057095    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057131    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057171    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a9b5d9-308c-4971-a073-9c88de98d8ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057199    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-machine-approver-tls\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057221    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057247    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057275    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhtwp\" (UniqueName: \"kubernetes.io/projected/892d15b6-460e-4892-a836-0cc284c8a326-kube-api-access-mhtwp\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057297    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pflfp\" (UniqueName: \"kubernetes.io/projected/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-kube-api-access-pflfp\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057322    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057343    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057369    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057390    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-serving-cert\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057401    4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh"]
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057414    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057439    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057462    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-config\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057482    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-serving-cert\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057504    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbpw\" (UniqueName: \"kubernetes.io/projected/edb4245a-7971-4c40-81b6-27d56b319a2f-kube-api-access-dvbpw\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057526    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057549    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057576    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fw6r\" (UniqueName: \"kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057598    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6fm\" (UniqueName: \"kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057619    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/892d15b6-460e-4892-a836-0cc284c8a326-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057678    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-policies\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057703    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a9b5d9-308c-4971-a073-9c88de98d8ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057724    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-client\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057746    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057777    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7h98\" (UniqueName: \"kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057797    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057819    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk47r\" (UniqueName: \"kubernetes.io/projected/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-kube-api-access-rk47r\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057846    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057866    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-encryption-config\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057875    4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057888    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057910    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-image-import-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057932    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-config\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057954    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-images\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.057979    4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4ls\" (UniqueName:
\"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-kube-api-access-td4ls\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058000 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-auth-proxy-config\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058067 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058111 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768c9343-1391-4221-b008-5dee2921953f-serving-cert\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058178 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058227 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058412 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75ck2\" (UniqueName: \"kubernetes.io/projected/6e22b5c7-4191-4f21-82ba-3014ccc4e978-kube-api-access-75ck2\") pod \"apiserver-76f77b778f-rz6g5\" (UID: 
\"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.058564 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.059450 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.060511 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.061402 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.061480 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.061790 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.061957 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.061975 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.064018 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.064208 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.064304 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.064458 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.064752 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.065128 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.065778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.066939 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.067636 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.067664 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.068154 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.068256 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.068505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.070472 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.072774 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.073078 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.073329 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.074201 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.074798 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.075068 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gd4gc"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.075745 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.077373 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.077413 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.078328 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.078671 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.079130 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.093455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.095455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.093946 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.098151 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nfs6"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.099073 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.100437 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.102486 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.105590 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.106591 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.107590 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.108736 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.108982 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vjv4q"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.109265 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.109940 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.111260 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.111924 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567810-wb89g"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.112185 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.112596 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.115019 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.115708 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.115792 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.116401 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.116794 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.117050 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.117610 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2z46g"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.118823 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.119575 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.120230 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.120579 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.122803 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.123669 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.127815 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.128499 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.129468 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.130840 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.131185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zs5pr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.131991 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.132603 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.134007 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.135350 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.136837 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hvfsp"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.138280 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.139194 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.140383 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.140777 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gd4gc"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.142495 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.143779 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.144981 
4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.146227 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fc4z8"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.147091 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.148146 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nfs6"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.149139 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.150193 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wst2s"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.151197 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.152268 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.153315 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.154533 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nstnr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.155658 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc"] Mar 21 04:51:01 
crc kubenswrapper[4775]: I0321 04:51:01.156824 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.158123 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.158846 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.158942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9395e24-0d4b-4165-bf60-068876927f58-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-trusted-ca\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-serving-cert\") pod 
\"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159387 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159477 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmnm\" (UniqueName: \"kubernetes.io/projected/9de83f0b-7dd2-4846-a1ce-c8af930778f4-kube-api-access-clmnm\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159620 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-serving-cert\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-config\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159570 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vjv4q"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-config\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mb9\" (UniqueName: \"kubernetes.io/projected/edb67e99-0d50-46e9-adbb-c0831dd915d8-kube-api-access-d9mb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160037 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-stats-auth\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160056 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw58z\" (UniqueName: \"kubernetes.io/projected/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-kube-api-access-nw58z\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.159769 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k97vh\" (UniqueName: \"kubernetes.io/projected/fe3df1e1-4c22-48df-aaea-469c864f0310-kube-api-access-k97vh\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhsgq\" (UniqueName: 
\"kubernetes.io/projected/c3d58eba-4ddf-463c-baa1-1943fb60c732-kube-api-access-bhsgq\") pod \"downloads-7954f5f757-nstnr\" (UID: \"c3d58eba-4ddf-463c-baa1-1943fb60c732\") " pod="openshift-console/downloads-7954f5f757-nstnr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-node-pullsecrets\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrqw\" (UniqueName: \"kubernetes.io/projected/768c9343-1391-4221-b008-5dee2921953f-kube-api-access-ntrqw\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160256 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zb5h\" (UniqueName: \"kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a49709c7-59e0-440e-89c2-177c42cd28e8-tmpfs\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160318 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfpw\" (UniqueName: \"kubernetes.io/projected/e5224539-6d29-4bc3-9656-4665eb287e28-kube-api-access-7zfpw\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plx8q\" (UniqueName: \"kubernetes.io/projected/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-kube-api-access-plx8q\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160391 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160456 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68d61e65-8275-4862-9dae-a75029889b2a-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/811f064c-ebf4-48ad-87a0-83205eb1eca5-metrics-tls\") pod \"dns-operator-744455d44c-vjv4q\" (UID: \"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160547 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-webhook-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pflfp\" (UniqueName: \"kubernetes.io/projected/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-kube-api-access-pflfp\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160736 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhtwp\" (UniqueName: \"kubernetes.io/projected/892d15b6-460e-4892-a836-0cc284c8a326-kube-api-access-mhtwp\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160786 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-serving-cert\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gr9p\" (UniqueName: 
\"kubernetes.io/projected/2dbc150e-3a25-4b44-b01c-effe99de5152-kube-api-access-5gr9p\") pod \"migrator-59844c95c7-7sfkh\" (UID: \"2dbc150e-3a25-4b44-b01c-effe99de5152\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160889 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-default-certificate\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160940 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fw6r\" (UniqueName: \"kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.160989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/892d15b6-460e-4892-a836-0cc284c8a326-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 
04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-trusted-ca\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161041 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-policies\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h4n4\" (UniqueName: \"kubernetes.io/projected/68d61e65-8275-4862-9dae-a75029889b2a-kube-api-access-8h4n4\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a9b5d9-308c-4971-a073-9c88de98d8ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161251 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-srv-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161305 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" 
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161357 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918078e1-af38-475a-86d5-8179cafa18db-serving-cert\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161378 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fwg\" (UniqueName: \"kubernetes.io/projected/02b06614-31da-49d9-bc97-cf61b065d42f-kube-api-access-s5fwg\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-config\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4ls\" (UniqueName: \"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-kube-api-access-td4ls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161448 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161485 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161500 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161516 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161533 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9fk\" (UniqueName: \"kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-mountpoint-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-encryption-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161648 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vd8k\" (UniqueName: \"kubernetes.io/projected/1d080332-c215-44c1-a027-65afbc612f88-kube-api-access-6vd8k\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918078e1-af38-475a-86d5-8179cafa18db-config\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161757 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1017f8be-2192-46a4-8717-ead73ca5e81b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161780 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nbfq\" (UniqueName: \"kubernetes.io/projected/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-kube-api-access-7nbfq\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkw4\" (UniqueName: \"kubernetes.io/projected/8bb7828f-6d99-4539-8312-c8e96bfbc608-kube-api-access-lqkw4\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161834 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161850 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm89b\" (UniqueName: \"kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b\") pod \"auto-csr-approver-29567810-wb89g\" (UID: \"cd25c8a4-8047-4602-a95b-3308af65bd38\") " pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit-dir\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb4245a-7971-4c40-81b6-27d56b319a2f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqjf\" (UniqueName: \"kubernetes.io/projected/a79bbac6-f40a-4c92-8854-7ab5e72573cc-kube-api-access-nhqjf\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqpfp\" (UniqueName: \"kubernetes.io/projected/bc2b8485-8cc1-4029-8318-397d4278e455-kube-api-access-tqpfp\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: 
\"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-config\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.161992 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162025 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892d15b6-460e-4892-a836-0cc284c8a326-serving-cert\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162073 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c52b4\" 
(UniqueName: \"kubernetes.io/projected/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-kube-api-access-c52b4\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162250 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/892d15b6-460e-4892-a836-0cc284c8a326-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-policies\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: 
\"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162445 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-client\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5224539-6d29-4bc3-9656-4665eb287e28-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-dir\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162701 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-plugins-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162792 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.162983 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a9b5d9-308c-4971-a073-9c88de98d8ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 
04:51:01.163013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-machine-approver-tls\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163077 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-socket-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blnw\" (UniqueName: \"kubernetes.io/projected/a49709c7-59e0-440e-89c2-177c42cd28e8-kube-api-access-9blnw\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-serving-cert\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbpw\" (UniqueName: \"kubernetes.io/projected/edb4245a-7971-4c40-81b6-27d56b319a2f-kube-api-access-dvbpw\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-metrics-certs\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163429 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-config\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: 
I0321 04:51:01.163684 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-images\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d080332-c215-44c1-a027-65afbc612f88-signing-key\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m6fm\" (UniqueName: \"kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163927 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: 
\"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.163989 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-config\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-config\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d080332-c215-44c1-a027-65afbc612f88-signing-cabundle\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-client\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164358 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7h98\" (UniqueName: \"kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164406 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rk47r\" (UniqueName: \"kubernetes.io/projected/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-kube-api-access-rk47r\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164425 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc2b8485-8cc1-4029-8318-397d4278e455-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: \"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-registration-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-encryption-config\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: 
\"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-image-import-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164595 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164069 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.165444 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit-dir\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.165410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.164560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8bb7828f-6d99-4539-8312-c8e96bfbc608-audit-dir\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.166916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.166993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.167076 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.167087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-images\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: 
I0321 04:51:01.168942 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-auth-proxy-config\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169081 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768c9343-1391-4221-b008-5dee2921953f-serving-cert\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169108 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-service-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r82zp\" (UniqueName: \"kubernetes.io/projected/918078e1-af38-475a-86d5-8179cafa18db-kube-api-access-r82zp\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: 
\"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169327 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.170335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.170347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-image-import-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.171058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/768c9343-1391-4221-b008-5dee2921953f-service-ca-bundle\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.171297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.172143 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-client\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.172391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.173534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892d15b6-460e-4892-a836-0cc284c8a326-serving-cert\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.173576 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb4245a-7971-4c40-81b6-27d56b319a2f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.173755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.174225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-serving-cert\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.174684 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.174929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 
crc kubenswrapper[4775]: I0321 04:51:01.175739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.175807 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a9b5d9-308c-4971-a073-9c88de98d8ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.176101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10a9b5d9-308c-4971-a073-9c88de98d8ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.176598 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bb7828f-6d99-4539-8312-c8e96bfbc608-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.176653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-config\") pod \"machine-approver-56656f9798-468g4\" (UID: 
\"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.176980 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.177330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.178738 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.178840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.179000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-machine-approver-tls\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.179223 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-audit\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.179672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.179739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-serving-cert\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.179846 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-serving-ca\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-serving-cert\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-auth-proxy-config\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.180918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/fe3df1e1-4c22-48df-aaea-469c864f0310-images\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.169228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.181034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.181153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m44cw\" (UniqueName: \"kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182078 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182226 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75ck2\" (UniqueName: \"kubernetes.io/projected/6e22b5c7-4191-4f21-82ba-3014ccc4e978-kube-api-access-75ck2\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8g7\" (UniqueName: \"kubernetes.io/projected/ae87529c-e52f-45b0-9bbf-2a652e628bc5-kube-api-access-td8g7\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182697 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpvcx\" (UniqueName: \"kubernetes.io/projected/811f064c-ebf4-48ad-87a0-83205eb1eca5-kube-api-access-kpvcx\") pod \"dns-operator-744455d44c-vjv4q\" (UID: \"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182744 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-client\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182924 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-config\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.182944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183027 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3df1e1-4c22-48df-aaea-469c864f0310-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183278 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02b06614-31da-49d9-bc97-cf61b065d42f-proxy-tls\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8bb7828f-6d99-4539-8312-c8e96bfbc608-encryption-config\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-csi-data-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183069 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/768c9343-1391-4221-b008-5dee2921953f-serving-cert\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68d61e65-8275-4862-9dae-a75029889b2a-proxy-tls\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183621 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.183798 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.184009 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.184053 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-config\") pod \"console-operator-58897d9998-fc4z8\" (UID: 
\"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.184087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.184831 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e22b5c7-4191-4f21-82ba-3014ccc4e978-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.185237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.185798 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6g5"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.186637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.187831 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-encryption-config\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.188473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3df1e1-4c22-48df-aaea-469c864f0310-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.188697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.188835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.190542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6e22b5c7-4191-4f21-82ba-3014ccc4e978-etcd-client\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.190603 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.190794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.192535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.194562 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lnkp"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.196280 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.197439 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-wb89g"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.198533 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.199531 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.200177 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.200549 4775 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.201627 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.202832 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6jnn5"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.203772 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.203885 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dnv7h"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.205646 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jnn5"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.205798 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.206382 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.207611 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.208700 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hvfsp"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.209743 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zs5pr"] Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.220504 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.236927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"02f610dfd36fe9bba151d035cc95b083bc2a0a134823b67b5fe8dd4a65fb82dd"} Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.236974 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c99c97d573d15e53e325a18910ea38150805d0399c3dc90e8de6aa020d44a415"} Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.237165 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.238365 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1a8243a3daa4a9d245ddf7c678db9d2493ed45293514f40f1e2db64d4e750ffb"} Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.238410 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d34204adb9c00482355cdebd845da5821eccc5e49d6d52361ac87a4148502dfd"} Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.239870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.260303 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.280332 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d080332-c215-44c1-a027-65afbc612f88-signing-cabundle\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" 
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285282 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc2b8485-8cc1-4029-8318-397d4278e455-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: \"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-registration-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285337 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-service-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285354 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m44cw\" (UniqueName: 
\"kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r82zp\" (UniqueName: \"kubernetes.io/projected/918078e1-af38-475a-86d5-8179cafa18db-kube-api-access-r82zp\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8g7\" (UniqueName: \"kubernetes.io/projected/ae87529c-e52f-45b0-9bbf-2a652e628bc5-kube-api-access-td8g7\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpvcx\" (UniqueName: \"kubernetes.io/projected/811f064c-ebf4-48ad-87a0-83205eb1eca5-kube-api-access-kpvcx\") pod \"dns-operator-744455d44c-vjv4q\" (UID: \"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02b06614-31da-49d9-bc97-cf61b065d42f-proxy-tls\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285484 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68d61e65-8275-4862-9dae-a75029889b2a-proxy-tls\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-csi-data-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285616 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285632 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9395e24-0d4b-4165-bf60-068876927f58-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285670 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-registration-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285699 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-csi-data-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-serving-cert\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285885 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clmnm\" (UniqueName: \"kubernetes.io/projected/9de83f0b-7dd2-4846-a1ce-c8af930778f4-kube-api-access-clmnm\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285912 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285936 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-config\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mb9\" (UniqueName: \"kubernetes.io/projected/edb67e99-0d50-46e9-adbb-c0831dd915d8-kube-api-access-d9mb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.285997 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-stats-auth\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw58z\" (UniqueName: \"kubernetes.io/projected/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-kube-api-access-nw58z\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a49709c7-59e0-440e-89c2-177c42cd28e8-tmpfs\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 
04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfpw\" (UniqueName: \"kubernetes.io/projected/e5224539-6d29-4bc3-9656-4665eb287e28-kube-api-access-7zfpw\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plx8q\" (UniqueName: \"kubernetes.io/projected/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-kube-api-access-plx8q\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-webhook-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68d61e65-8275-4862-9dae-a75029889b2a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286198 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/811f064c-ebf4-48ad-87a0-83205eb1eca5-metrics-tls\") 
pod \"dns-operator-744455d44c-vjv4q\" (UID: \"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gr9p\" (UniqueName: \"kubernetes.io/projected/2dbc150e-3a25-4b44-b01c-effe99de5152-kube-api-access-5gr9p\") pod \"migrator-59844c95c7-7sfkh\" (UID: \"2dbc150e-3a25-4b44-b01c-effe99de5152\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286246 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-default-certificate\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h4n4\" (UniqueName: \"kubernetes.io/projected/68d61e65-8275-4862-9dae-a75029889b2a-kube-api-access-8h4n4\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286306 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-srv-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286341 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286356 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918078e1-af38-475a-86d5-8179cafa18db-serving-cert\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fwg\" (UniqueName: \"kubernetes.io/projected/02b06614-31da-49d9-bc97-cf61b065d42f-kube-api-access-s5fwg\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9fk\" (UniqueName: \"kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-mountpoint-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vd8k\" (UniqueName: 
\"kubernetes.io/projected/1d080332-c215-44c1-a027-65afbc612f88-kube-api-access-6vd8k\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286460 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286476 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918078e1-af38-475a-86d5-8179cafa18db-config\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286505 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1017f8be-2192-46a4-8717-ead73ca5e81b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqjf\" (UniqueName: \"kubernetes.io/projected/a79bbac6-f40a-4c92-8854-7ab5e72573cc-kube-api-access-nhqjf\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286560 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm89b\" (UniqueName: \"kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b\") pod \"auto-csr-approver-29567810-wb89g\" (UID: \"cd25c8a4-8047-4602-a95b-3308af65bd38\") " pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286579 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqpfp\" (UniqueName: \"kubernetes.io/projected/bc2b8485-8cc1-4029-8318-397d4278e455-kube-api-access-tqpfp\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: \"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286597 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c52b4\" (UniqueName: \"kubernetes.io/projected/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-kube-api-access-c52b4\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-client\") pod 
\"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286654 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5224539-6d29-4bc3-9656-4665eb287e28-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286686 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286725 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-plugins-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-mountpoint-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286948 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a49709c7-59e0-440e-89c2-177c42cd28e8-tmpfs\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/68d61e65-8275-4862-9dae-a75029889b2a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blnw\" (UniqueName: \"kubernetes.io/projected/a49709c7-59e0-440e-89c2-177c42cd28e8-kube-api-access-9blnw\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc 
kubenswrapper[4775]: I0321 04:51:01.287459 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-socket-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287479 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-metrics-certs\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-plugins-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.286749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-config\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-images\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287598 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9de83f0b-7dd2-4846-a1ce-c8af930778f4-socket-dir\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d080332-c215-44c1-a027-65afbc612f88-signing-key\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287638 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: 
\"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.287681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-config\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.289259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68d61e65-8275-4862-9dae-a75029889b2a-proxy-tls\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.289865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.290590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5224539-6d29-4bc3-9656-4665eb287e28-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.290928 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.291572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.300019 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.319667 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.339961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.349532 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.360155 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.380694 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.401388 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.420603 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.428066 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.441396 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.450520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc2b8485-8cc1-4029-8318-397d4278e455-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: \"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.461220 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.473765 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-srv-cert\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.480303 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.501854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.520706 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.531622 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d080332-c215-44c1-a027-65afbc612f88-signing-key\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.540538 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.546930 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d080332-c215-44c1-a027-65afbc612f88-signing-cabundle\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.560653 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 
04:51:01.599827 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.619957 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.639928 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.660708 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.669900 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.686111 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.687918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.701391 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.720076 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.728468 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-config\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.739897 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.759725 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.769015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-serving-cert\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.779667 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.788524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.800014 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" 
Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.807177 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-service-ca\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.819596 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.835560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-etcd-client\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.840320 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.860308 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.880789 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.900101 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.910018 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/811f064c-ebf4-48ad-87a0-83205eb1eca5-metrics-tls\") pod \"dns-operator-744455d44c-vjv4q\" (UID: 
\"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.919787 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.939700 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.960174 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.980942 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:51:01 crc kubenswrapper[4775]: I0321 04:51:01.992204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/918078e1-af38-475a-86d5-8179cafa18db-serving-cert\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.000158 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.008890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/918078e1-af38-475a-86d5-8179cafa18db-config\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.020395 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.040753 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.063882 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.080186 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.089300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/02b06614-31da-49d9-bc97-cf61b065d42f-images\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.100427 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.108738 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02b06614-31da-49d9-bc97-cf61b065d42f-proxy-tls\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.118317 4775 request.go:700] Waited for 1.001811711s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-operator-dockercfg-98p87&limit=500&resourceVersion=0 Mar 21 
04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.120237 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.140947 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.150346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-webhook-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.150689 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a49709c7-59e0-440e-89c2-177c42cd28e8-apiservice-cert\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.160580 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.179993 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.187077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.200055 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.220453 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.231320 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-default-certificate\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.240841 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.250370 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-stats-auth\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.260337 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.270197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a79bbac6-f40a-4c92-8854-7ab5e72573cc-metrics-certs\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:02 crc kubenswrapper[4775]: 
I0321 04:51:02.280186 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.286217 4775 configmap.go:193] Couldn't get configMap openshift-controller-manager-operator/openshift-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.286285 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config podName:edb67e99-0d50-46e9-adbb-c0831dd915d8 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.786269263 +0000 UTC m=+215.762732887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config") pod "openshift-controller-manager-operator-756b6f6bc6-xv7nw" (UID: "edb67e99-0d50-46e9-adbb-c0831dd915d8") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.286218 4775 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.286329 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls podName:ae87529c-e52f-45b0-9bbf-2a652e628bc5 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.786319765 +0000 UTC m=+215.762783389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls") pod "dns-default-zs5pr" (UID: "ae87529c-e52f-45b0-9bbf-2a652e628bc5") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287330 4775 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287406 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle podName:a79bbac6-f40a-4c92-8854-7ab5e72573cc nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.787392725 +0000 UTC m=+215.763856359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle") pod "router-default-5444994796-2z46g" (UID: "a79bbac6-f40a-4c92-8854-7ab5e72573cc") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287429 4775 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287455 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert podName:f9395e24-0d4b-4165-bf60-068876927f58 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.787447307 +0000 UTC m=+215.763911051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" (UID: "f9395e24-0d4b-4165-bf60-068876927f58") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287500 4775 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287546 4775 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287566 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume podName:ae87529c-e52f-45b0-9bbf-2a652e628bc5 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.78755678 +0000 UTC m=+215.764020404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume") pod "dns-default-zs5pr" (UID: "ae87529c-e52f-45b0-9bbf-2a652e628bc5") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287669 4775 secret.go:188] Couldn't get secret openshift-kube-apiserver-operator/kube-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287699 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config podName:f9395e24-0d4b-4165-bf60-068876927f58 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:02.787678614 +0000 UTC m=+215.764142238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" (UID: "f9395e24-0d4b-4165-bf60-068876927f58") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287582 4775 secret.go:188] Couldn't get secret openshift-controller-manager-operator/openshift-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287716 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert podName:1017f8be-2192-46a4-8717-ead73ca5e81b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.787708925 +0000 UTC m=+215.764172549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert") pod "kube-apiserver-operator-766d6c64bb-2zh8p" (UID: "1017f8be-2192-46a4-8717-ead73ca5e81b") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287643 4775 configmap.go:193] Couldn't get configMap openshift-kube-apiserver-operator/kube-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287733 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert podName:edb67e99-0d50-46e9-adbb-c0831dd915d8 nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.787724085 +0000 UTC m=+215.764187829 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert") pod "openshift-controller-manager-operator-756b6f6bc6-xv7nw" (UID: "edb67e99-0d50-46e9-adbb-c0831dd915d8") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: E0321 04:51:02.287768 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config podName:1017f8be-2192-46a4-8717-ead73ca5e81b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:02.787761176 +0000 UTC m=+215.764224800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config") pod "kube-apiserver-operator-766d6c64bb-2zh8p" (UID: "1017f8be-2192-46a4-8717-ead73ca5e81b") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.300297 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.320419 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.340106 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.360806 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.381023 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.400603 
4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.420541 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.440559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.459878 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.480954 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.482441 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.482524 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.500234 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 
04:51:02.520609 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.540703 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.560761 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.580502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.606577 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.620034 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.640104 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.661142 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.681430 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.701103 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.720207 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.740854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.760925 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.782913 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.802009 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.809617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.809691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.809738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.809841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.810781 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.810831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.810851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.810917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.811281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79bbac6-f40a-4c92-8854-7ab5e72573cc-service-ca-bundle\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.811381 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.812021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1017f8be-2192-46a4-8717-ead73ca5e81b-config\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.812344 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae87529c-e52f-45b0-9bbf-2a652e628bc5-config-volume\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.812349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f9395e24-0d4b-4165-bf60-068876927f58-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.812617 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edb67e99-0d50-46e9-adbb-c0831dd915d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.815088 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edb67e99-0d50-46e9-adbb-c0831dd915d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.815276 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9395e24-0d4b-4165-bf60-068876927f58-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.815456 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae87529c-e52f-45b0-9bbf-2a652e628bc5-metrics-tls\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:02 crc 
kubenswrapper[4775]: I0321 04:51:02.817771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1017f8be-2192-46a4-8717-ead73ca5e81b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.857720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k97vh\" (UniqueName: \"kubernetes.io/projected/fe3df1e1-4c22-48df-aaea-469c864f0310-kube-api-access-k97vh\") pod \"machine-api-operator-5694c8668f-5rnj6\" (UID: \"fe3df1e1-4c22-48df-aaea-469c864f0310\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.858974 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.883775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhsgq\" (UniqueName: \"kubernetes.io/projected/c3d58eba-4ddf-463c-baa1-1943fb60c732-kube-api-access-bhsgq\") pod \"downloads-7954f5f757-nstnr\" (UID: \"c3d58eba-4ddf-463c-baa1-1943fb60c732\") " pod="openshift-console/downloads-7954f5f757-nstnr" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.896745 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zb5h\" (UniqueName: \"kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h\") pod \"controller-manager-879f6c89f-r6c6z\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.914010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ntrqw\" (UniqueName: \"kubernetes.io/projected/768c9343-1391-4221-b008-5dee2921953f-kube-api-access-ntrqw\") pod \"authentication-operator-69f744f599-dzxd5\" (UID: \"768c9343-1391-4221-b008-5dee2921953f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.934415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pflfp\" (UniqueName: \"kubernetes.io/projected/6343dd4f-5c5b-4c94-a4d6-f603698ba6ac-kube-api-access-pflfp\") pod \"machine-approver-56656f9798-468g4\" (UID: \"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.942668 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.966777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhtwp\" (UniqueName: \"kubernetes.io/projected/892d15b6-460e-4892-a836-0cc284c8a326-kube-api-access-mhtwp\") pod \"openshift-config-operator-7777fb866f-mthhs\" (UID: \"892d15b6-460e-4892-a836-0cc284c8a326\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.975480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fw6r\" (UniqueName: \"kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r\") pod \"oauth-openshift-558db77b4-jd86b\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.989612 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" Mar 21 04:51:02 crc kubenswrapper[4775]: I0321 04:51:02.994447 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.022678 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nstnr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.032472 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.053444 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbpw\" (UniqueName: \"kubernetes.io/projected/edb4245a-7971-4c40-81b6-27d56b319a2f-kube-api-access-dvbpw\") pod \"cluster-samples-operator-665b6dd947-k5dks\" (UID: \"edb4245a-7971-4c40-81b6-27d56b319a2f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.062799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkw4\" (UniqueName: \"kubernetes.io/projected/8bb7828f-6d99-4539-8312-c8e96bfbc608-kube-api-access-lqkw4\") pod \"apiserver-7bbb656c7d-kxtlf\" (UID: \"8bb7828f-6d99-4539-8312-c8e96bfbc608\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.100901 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nbfq\" (UniqueName: 
\"kubernetes.io/projected/2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a-kube-api-access-7nbfq\") pod \"openshift-apiserver-operator-796bbdcf4f-5pdcr\" (UID: \"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.101503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6fm\" (UniqueName: \"kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm\") pod \"console-f9d7485db-wst2s\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.111986 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk47r\" (UniqueName: \"kubernetes.io/projected/12b1cc27-6a60-43d2-9d5e-eb7a54c1e899-kube-api-access-rk47r\") pod \"console-operator-58897d9998-fc4z8\" (UID: \"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899\") " pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.118623 4775 request.go:700] Waited for 1.938755618s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.126681 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7h98\" (UniqueName: \"kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98\") pod \"route-controller-manager-6576b87f9c-55lcp\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.138579 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-td4ls\" (UniqueName: \"kubernetes.io/projected/10a9b5d9-308c-4971-a073-9c88de98d8ea-kube-api-access-td4ls\") pod \"cluster-image-registry-operator-dc59b4c8b-9v866\" (UID: \"10a9b5d9-308c-4971-a073-9c88de98d8ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.153731 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"] Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.156779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.159546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.160869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75ck2\" (UniqueName: \"kubernetes.io/projected/6e22b5c7-4191-4f21-82ba-3014ccc4e978-kube-api-access-75ck2\") pod \"apiserver-76f77b778f-rz6g5\" (UID: \"6e22b5c7-4191-4f21-82ba-3014ccc4e978\") " pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.166930 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.181364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.196393 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.203485 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.210106 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.223337 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.238182 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.239023 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5rnj6"] Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.240196 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.249445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" event={"ID":"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac","Type":"ContainerStarted","Data":"17aca1ec1c735ebefc774f0f3540a58335b82734921e82f6cf0312e200950325"} Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.250172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" event={"ID":"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb","Type":"ContainerStarted","Data":"3aa7cdf6ad840aefd2de6cc5405b71b516de12c087774e2fad897251db2fc6ea"} Mar 21 04:51:03 crc kubenswrapper[4775]: 
I0321 04:51:03.258172 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.259388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.259849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.277804 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.284304 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.298189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.319406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m44cw\" (UniqueName: \"kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw\") pod \"marketplace-operator-79b997595-wrzs4\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.320330 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.327402 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.338717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8g7\" (UniqueName: \"kubernetes.io/projected/ae87529c-e52f-45b0-9bbf-2a652e628bc5-kube-api-access-td8g7\") pod \"dns-default-zs5pr\" (UID: \"ae87529c-e52f-45b0-9bbf-2a652e628bc5\") " pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.362457 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r82zp\" (UniqueName: \"kubernetes.io/projected/918078e1-af38-475a-86d5-8179cafa18db-kube-api-access-r82zp\") pod \"service-ca-operator-777779d784-wqjnj\" (UID: \"918078e1-af38-475a-86d5-8179cafa18db\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.388276 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpvcx\" (UniqueName: \"kubernetes.io/projected/811f064c-ebf4-48ad-87a0-83205eb1eca5-kube-api-access-kpvcx\") pod \"dns-operator-744455d44c-vjv4q\" (UID: \"811f064c-ebf4-48ad-87a0-83205eb1eca5\") " pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.397929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9395e24-0d4b-4165-bf60-068876927f58-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5ztsd\" (UID: \"f9395e24-0d4b-4165-bf60-068876927f58\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.418939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmnm\" (UniqueName: 
\"kubernetes.io/projected/9de83f0b-7dd2-4846-a1ce-c8af930778f4-kube-api-access-clmnm\") pod \"csi-hostpathplugin-hvfsp\" (UID: \"9de83f0b-7dd2-4846-a1ce-c8af930778f4\") " pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.433287 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.439269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mb9\" (UniqueName: \"kubernetes.io/projected/edb67e99-0d50-46e9-adbb-c0831dd915d8-kube-api-access-d9mb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-xv7nw\" (UID: \"edb67e99-0d50-46e9-adbb-c0831dd915d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.454895 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.456322 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.457406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw58z\" (UniqueName: \"kubernetes.io/projected/07720715-40fb-4f7d-8f6b-381dc7cf9ea2-kube-api-access-nw58z\") pod \"etcd-operator-b45778765-4nfs6\" (UID: \"07720715-40fb-4f7d-8f6b-381dc7cf9ea2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.501528 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plx8q\" (UniqueName: \"kubernetes.io/projected/b0b465ff-f8f3-4a99-9235-1ddc4ce093e4-kube-api-access-plx8q\") pod \"olm-operator-6b444d44fb-v296h\" (UID: \"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.521979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfpw\" (UniqueName: \"kubernetes.io/projected/e5224539-6d29-4bc3-9656-4665eb287e28-kube-api-access-7zfpw\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7xpn\" (UID: \"e5224539-6d29-4bc3-9656-4665eb287e28\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.528244 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.540402 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.545200 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nstnr"] Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.549891 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fwg\" (UniqueName: \"kubernetes.io/projected/02b06614-31da-49d9-bc97-cf61b065d42f-kube-api-access-s5fwg\") pod \"machine-config-operator-74547568cd-8kpqj\" (UID: \"02b06614-31da-49d9-bc97-cf61b065d42f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.551590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gr9p\" (UniqueName: \"kubernetes.io/projected/2dbc150e-3a25-4b44-b01c-effe99de5152-kube-api-access-5gr9p\") pod \"migrator-59844c95c7-7sfkh\" (UID: \"2dbc150e-3a25-4b44-b01c-effe99de5152\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.556462 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zs5pr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.566412 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.567949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h4n4\" (UniqueName: \"kubernetes.io/projected/68d61e65-8275-4862-9dae-a75029889b2a-kube-api-access-8h4n4\") pod \"machine-config-controller-84d6567774-57cxh\" (UID: \"68d61e65-8275-4862-9dae-a75029889b2a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.589195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm9fk\" (UniqueName: \"kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk\") pod \"collect-profiles-29567805-97bct\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.602957 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqpfp\" (UniqueName: \"kubernetes.io/projected/bc2b8485-8cc1-4029-8318-397d4278e455-kube-api-access-tqpfp\") pod \"package-server-manager-789f6589d5-mbgvc\" (UID: \"bc2b8485-8cc1-4029-8318-397d4278e455\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.619964 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c52b4\" (UniqueName: \"kubernetes.io/projected/0c0dc3b9-8e11-4ff0-8f26-549ce47215f0-kube-api-access-c52b4\") pod \"kube-storage-version-migrator-operator-b67b599dd-blwmr\" (UID: \"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.640365 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.643138 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vd8k\" (UniqueName: \"kubernetes.io/projected/1d080332-c215-44c1-a027-65afbc612f88-kube-api-access-6vd8k\") pod \"service-ca-9c57cc56f-gd4gc\" (UID: \"1d080332-c215-44c1-a027-65afbc612f88\") " pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.664137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eddac2a3-8b07-410d-9e9f-f7ce0c05abf5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-725jh\" (UID: \"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.664296 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.678938 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.679379 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.687818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1017f8be-2192-46a4-8717-ead73ca5e81b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2zh8p\" (UID: \"1017f8be-2192-46a4-8717-ead73ca5e81b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.697533 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.705376 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.712354 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.712557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqjf\" (UniqueName: \"kubernetes.io/projected/a79bbac6-f40a-4c92-8854-7ab5e72573cc-kube-api-access-nhqjf\") pod \"router-default-5444994796-2z46g\" (UID: \"a79bbac6-f40a-4c92-8854-7ab5e72573cc\") " pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.714619 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm89b\" (UniqueName: \"kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b\") pod \"auto-csr-approver-29567810-wb89g\" (UID: \"cd25c8a4-8047-4602-a95b-3308af65bd38\") " pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.724794 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.737820 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.747565 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blnw\" (UniqueName: \"kubernetes.io/projected/a49709c7-59e0-440e-89c2-177c42cd28e8-kube-api-access-9blnw\") pod \"packageserver-d55dfcdfc-sp69c\" (UID: \"a49709c7-59e0-440e-89c2-177c42cd28e8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.761148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf"] Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.777746 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.790278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dzxd5"] Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.809395 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.809616 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.809906 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.813524 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.819437 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-profile-collector-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835861 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdww\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-kube-api-access-bqdww\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835896 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835926 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j66\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835955 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.835987 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: \"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-srv-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7kh\" (UniqueName: \"kubernetes.io/projected/d55e67e9-9980-4878-9875-a07207894f6f-kube-api-access-hv7kh\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836182 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhcjp\" (UniqueName: \"kubernetes.io/projected/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-kube-api-access-hhcjp\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: \"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.836272 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: E0321 04:51:03.839652 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.339637433 +0000 UTC m=+217.316101057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:03 crc kubenswrapper[4775]: W0321 04:51:03.896796 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79bbac6_f40a_4c92_8854_7ab5e72573cc.slice/crio-20eb94421e0eea64be64321e6c00b000dee79fd4a4662f6a369022dd15653f7e WatchSource:0}: Error finding container 20eb94421e0eea64be64321e6c00b000dee79fd4a4662f6a369022dd15653f7e: Status 404 returned error can't find the container with id 20eb94421e0eea64be64321e6c00b000dee79fd4a4662f6a369022dd15653f7e Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.937734 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938347 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-profile-collector-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdww\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-kube-api-access-bqdww\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-node-bootstrap-token\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c6eec5-7117-48de-a4c5-640517da6a76-cert\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j66\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938772 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: \"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.938927 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-srv-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7kh\" (UniqueName: \"kubernetes.io/projected/d55e67e9-9980-4878-9875-a07207894f6f-kube-api-access-hv7kh\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhcjp\" (UniqueName: \"kubernetes.io/projected/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-kube-api-access-hhcjp\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: \"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939265 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgjp\" (UniqueName: \"kubernetes.io/projected/ecae58ef-adaa-42ea-9246-49845334a819-kube-api-access-xbgjp\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-certs\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.939369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpwh\" (UniqueName: \"kubernetes.io/projected/35c6eec5-7117-48de-a4c5-640517da6a76-kube-api-access-5bpwh\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:03 crc kubenswrapper[4775]: E0321 04:51:03.945076 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.441353055 +0000 UTC m=+217.417816689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.954074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.954343 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.955175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.955544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: 
\"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.965800 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.970405 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdww\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-kube-api-access-bqdww\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.972636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-srv-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.975419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.976193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token\") pod 
\"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.976840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:03 crc kubenswrapper[4775]: I0321 04:51:03.977889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d55e67e9-9980-4878-9875-a07207894f6f-profile-collector-cert\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.008399 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.040413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2j66\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-node-bootstrap-token\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c6eec5-7117-48de-a4c5-640517da6a76-cert\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgjp\" (UniqueName: \"kubernetes.io/projected/ecae58ef-adaa-42ea-9246-49845334a819-kube-api-access-xbgjp\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-certs\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.042262 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpwh\" (UniqueName: \"kubernetes.io/projected/35c6eec5-7117-48de-a4c5-640517da6a76-kube-api-access-5bpwh\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.042625 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.542614213 +0000 UTC m=+217.519077837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.049090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c6eec5-7117-48de-a4c5-640517da6a76-cert\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.049506 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-node-bootstrap-token\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.051981 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7kh\" (UniqueName: \"kubernetes.io/projected/d55e67e9-9980-4878-9875-a07207894f6f-kube-api-access-hv7kh\") pod \"catalog-operator-68c6474976-zkl7p\" (UID: \"d55e67e9-9980-4878-9875-a07207894f6f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.054515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecae58ef-adaa-42ea-9246-49845334a819-certs\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.059812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhcjp\" (UniqueName: \"kubernetes.io/projected/b0a24bc9-b395-4732-ba57-b096ee9ffdb1-kube-api-access-hhcjp\") pod \"multus-admission-controller-857f4d67dd-2lnkp\" (UID: \"b0a24bc9-b395-4732-ba57-b096ee9ffdb1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.089707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vsh8z\" (UID: \"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.119823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgjp\" (UniqueName: \"kubernetes.io/projected/ecae58ef-adaa-42ea-9246-49845334a819-kube-api-access-xbgjp\") pod \"machine-config-server-dnv7h\" (UID: \"ecae58ef-adaa-42ea-9246-49845334a819\") " 
pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.145617 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.146138 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.646100044 +0000 UTC m=+217.622563668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.147865 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.159546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpwh\" (UniqueName: \"kubernetes.io/projected/35c6eec5-7117-48de-a4c5-640517da6a76-kube-api-access-5bpwh\") pod \"ingress-canary-6jnn5\" (UID: \"35c6eec5-7117-48de-a4c5-640517da6a76\") " pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.191705 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6jnn5" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.201682 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dnv7h" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.248878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.249171 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.749158473 +0000 UTC m=+217.725622097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.256823 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.263884 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.290913 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fc4z8"] Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.292666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nstnr" event={"ID":"c3d58eba-4ddf-463c-baa1-1943fb60c732","Type":"ContainerStarted","Data":"591e37f237061eb3eb2e78ab869ffea4d983607a257bac29e4fb969c2fcd9940"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.292707 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nstnr" event={"ID":"c3d58eba-4ddf-463c-baa1-1943fb60c732","Type":"ContainerStarted","Data":"5523732cc967700d7fbbdac853665f17cf7feebe5b7b0a8fb0f722ef5753d557"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.292992 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nstnr" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.294082 4775 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.294198 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.297680 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" event={"ID":"fe3df1e1-4c22-48df-aaea-469c864f0310","Type":"ContainerStarted","Data":"9544447e99441f87401ed20c2711c92e4aea86d65db8c4871e6c026e52807c41"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.297963 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" event={"ID":"fe3df1e1-4c22-48df-aaea-469c864f0310","Type":"ContainerStarted","Data":"d9f6011b3900db1117e47de329dbbfd8cf7a8f07048ae0cad95b86eb83444093"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.297978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" event={"ID":"fe3df1e1-4c22-48df-aaea-469c864f0310","Type":"ContainerStarted","Data":"b7eff7cc15d1d879cb238a4a50f9e6b0725bb469b6c1561f9ac5294d7c2bc0c6"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.303231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" event={"ID":"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac","Type":"ContainerStarted","Data":"dec4e9479380b7e735d9738838d634ee2790d7b2a87d040b08f5a5fd61367c2e"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 
04:51:04.304520 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr"] Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.313381 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" event={"ID":"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb","Type":"ContainerStarted","Data":"2f124547d6c6b6e4cadb63b6c1732b81d7643f21b62f22762426569eabbfc7c4"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.315514 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.319572 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rz6g5"] Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.319624 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"] Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.320575 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2z46g" event={"ID":"a79bbac6-f40a-4c92-8854-7ab5e72573cc","Type":"ContainerStarted","Data":"20eb94421e0eea64be64321e6c00b000dee79fd4a4662f6a369022dd15653f7e"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.324887 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" event={"ID":"768c9343-1391-4221-b008-5dee2921953f","Type":"ContainerStarted","Data":"b4f28a77963b985e8d55e24993d011cf12ef9fe6081a370032418a9d8ea0ffda"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.325582 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r6c6z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.325618 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.329001 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" event={"ID":"8bb7828f-6d99-4539-8312-c8e96bfbc608","Type":"ContainerStarted","Data":"ac8f4456eebf6c010d00843db2a2f64d8e826e8bf72978a78df02bc12cb5b96f"} Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.349772 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.349986 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.849961108 +0000 UTC m=+217.826424732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.350194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.351371 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.851363599 +0000 UTC m=+217.827827223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.453464 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.454650 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:04.954633034 +0000 UTC m=+217.931096668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: W0321 04:51:04.484652 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b1cc27_6a60_43d2_9d5e_eb7a54c1e899.slice/crio-a0ddf7f265c9468b0e817c2b6ad7ddc35b80f659e965df1ca89854dfd66fbfcb WatchSource:0}: Error finding container a0ddf7f265c9468b0e817c2b6ad7ddc35b80f659e965df1ca89854dfd66fbfcb: Status 404 returned error can't find the container with id a0ddf7f265c9468b0e817c2b6ad7ddc35b80f659e965df1ca89854dfd66fbfcb Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.554497 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.554817 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.054805981 +0000 UTC m=+218.031269605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.696108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.696366 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.196344146 +0000 UTC m=+218.172807770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.696888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.697239 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.197224371 +0000 UTC m=+218.173688005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.804812 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.805344 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.305322896 +0000 UTC m=+218.281786520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:04 crc kubenswrapper[4775]: I0321 04:51:04.905855 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:04 crc kubenswrapper[4775]: E0321 04:51:04.906229 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.406216823 +0000 UTC m=+218.382680447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:04.997592 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nstnr" podStartSLOduration=158.997572017 podStartE2EDuration="2m38.997572017s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:04.971509219 +0000 UTC m=+217.947972843" watchObservedRunningTime="2026-03-21 04:51:04.997572017 +0000 UTC m=+217.974035641" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:04.997814 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.001170 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.006632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.007318 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.507298986 +0000 UTC m=+218.483762610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.012265 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"] Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.017328 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9baf4c_70b2_450f_9b21_76dfafbc44d0.slice/crio-1dc3c475bbed8c4530a54779f813884a49b5a06477a18a344d8eb396adfd545f WatchSource:0}: Error finding container 1dc3c475bbed8c4530a54779f813884a49b5a06477a18a344d8eb396adfd545f: Status 404 returned error can't find the container with id 1dc3c475bbed8c4530a54779f813884a49b5a06477a18a344d8eb396adfd545f Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.020183 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wst2s"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.023740 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.040780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.047546 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vjv4q"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.108316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.108759 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.60874371 +0000 UTC m=+218.585207334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.180808 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.210241 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.210633 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.710613065 +0000 UTC m=+218.687076689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.215920 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.218595 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.227495 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.235437 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.244632 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.274614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-wb89g"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.274667 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zs5pr"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.278993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj"] Mar 21 
04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.283631 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gd4gc"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.287061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.293173 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.295099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.296777 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4nfs6"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.298495 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hvfsp"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.312292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.312694 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.812670606 +0000 UTC m=+218.789134230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.346012 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" event={"ID":"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a","Type":"ContainerStarted","Data":"4ccdd23e8be4d6c9539771eef6195bf56b8b0ee2c73a8319460776e9e587d6d9"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.346291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" event={"ID":"2d2512b8-9fb6-4b9b-9ac2-268e0b8bd04a","Type":"ContainerStarted","Data":"1f7c118357de68461245eae47ffb437a1633b5749182b21c0c72fdfc4c679eb2"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.347832 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" event={"ID":"6343dd4f-5c5b-4c94-a4d6-f603698ba6ac","Type":"ContainerStarted","Data":"354589bc74d0e1df42a39518ab6dbad16d637dd99da7e74ffac93a01ae49e39c"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.350862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wst2s" event={"ID":"a9ed1e0e-eff9-4690-bcf5-45f6074c200e","Type":"ContainerStarted","Data":"c582e2fab200954a9843b0514b416a8988bf80f007355b323c81a87c8930c6fa"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.351548 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" event={"ID":"fddde3da-8512-4e62-9c38-b59f98e117e0","Type":"ContainerStarted","Data":"bdc921caf55fdc9483dd16d86e33b3865a46d3e8a029b5b72e32afe0666cef10"} Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.352323 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918078e1_af38_475a_86d5_8179cafa18db.slice/crio-b1b80ce740efd38d0818a1758ffcc9a25de23ff4e3d9a46b3ed20d9fb4f1765f WatchSource:0}: Error finding container b1b80ce740efd38d0818a1758ffcc9a25de23ff4e3d9a46b3ed20d9fb4f1765f: Status 404 returned error can't find the container with id b1b80ce740efd38d0818a1758ffcc9a25de23ff4e3d9a46b3ed20d9fb4f1765f Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.352480 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" event={"ID":"768c9343-1391-4221-b008-5dee2921953f","Type":"ContainerStarted","Data":"378f686eba4695c392992cbcc55f4bb7d43751a0aa1f6827e445289cf7ab59ba"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.354673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" event={"ID":"2dbc150e-3a25-4b44-b01c-effe99de5152","Type":"ContainerStarted","Data":"81cfc7994200911020eefe9ee8577746708ae643101c25b2f8cec803d2f18463"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.355473 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" event={"ID":"f9395e24-0d4b-4165-bf60-068876927f58","Type":"ContainerStarted","Data":"2f0781c3740928ce2d7eb2feb485c88f02d9ea7d5356a38d011e5a61cb72029a"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.356153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" 
event={"ID":"811f064c-ebf4-48ad-87a0-83205eb1eca5","Type":"ContainerStarted","Data":"c17997ba96f183ce91c800c29ded59676be28e8d3a7e31d33b00efb6bade2a36"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.357083 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" event={"ID":"892d15b6-460e-4892-a836-0cc284c8a326","Type":"ContainerStarted","Data":"16776553b2840887617e187a152ecb08b84ab564361957b0471e5c74a0b97f4f"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.357133 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" event={"ID":"892d15b6-460e-4892-a836-0cc284c8a326","Type":"ContainerStarted","Data":"575843aeea1bfa101dadda9f7c658bc97392556098e7a54921dab614fe073b20"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.359482 4775 generic.go:334] "Generic (PLEG): container finished" podID="8bb7828f-6d99-4539-8312-c8e96bfbc608" containerID="9377ec80f0257a109a3ef8a2ab177b965b751362a7bed153c19bdb616f962c5b" exitCode=0 Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.359550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" event={"ID":"8bb7828f-6d99-4539-8312-c8e96bfbc608","Type":"ContainerDied","Data":"9377ec80f0257a109a3ef8a2ab177b965b751362a7bed153c19bdb616f962c5b"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.362066 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" event={"ID":"10a9b5d9-308c-4971-a073-9c88de98d8ea","Type":"ContainerStarted","Data":"d391c9939f47b2c46b89dc19c1bfd9d652cf6bd95d985bf56a1c9557acbcc9fb"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.366828 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" 
event={"ID":"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4","Type":"ContainerStarted","Data":"006a4227f3b9a53c19aee02bd2ba8f8065dbb5adc9267be25a07167778e43098"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.367877 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" event={"ID":"bc2b8485-8cc1-4029-8318-397d4278e455","Type":"ContainerStarted","Data":"0f6819a8fe627dd9cd2c2a54f184a37a7e20e8bcef4d909c3ed46add017746b7"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.369274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" event={"ID":"ed9baf4c-70b2-450f-9b21-76dfafbc44d0","Type":"ContainerStarted","Data":"1dc3c475bbed8c4530a54779f813884a49b5a06477a18a344d8eb396adfd545f"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.370373 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5rnj6" podStartSLOduration=159.370359623 podStartE2EDuration="2m39.370359623s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:05.368194341 +0000 UTC m=+218.344657975" watchObservedRunningTime="2026-03-21 04:51:05.370359623 +0000 UTC m=+218.346823247" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.371424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" event={"ID":"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5","Type":"ContainerStarted","Data":"fac4dd9f3b6b0094e30d1053e0f38da29cf7c45cc03525a6d3a097acc02a041e"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.376541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-fc4z8" event={"ID":"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899","Type":"ContainerStarted","Data":"6f0ed735e075d381eb3244b0026ac1423e2945f8047e0e69ac0bbbe2cd398281"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.376589 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" event={"ID":"12b1cc27-6a60-43d2-9d5e-eb7a54c1e899","Type":"ContainerStarted","Data":"a0ddf7f265c9468b0e817c2b6ad7ddc35b80f659e965df1ca89854dfd66fbfcb"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.381096 4775 generic.go:334] "Generic (PLEG): container finished" podID="6e22b5c7-4191-4f21-82ba-3014ccc4e978" containerID="a5a2690a7310f199883212a85e22e16730b42ff31cd6f309adf16033e7c667f2" exitCode=0 Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.381191 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" event={"ID":"6e22b5c7-4191-4f21-82ba-3014ccc4e978","Type":"ContainerDied","Data":"a5a2690a7310f199883212a85e22e16730b42ff31cd6f309adf16033e7c667f2"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.381222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" event={"ID":"6e22b5c7-4191-4f21-82ba-3014ccc4e978","Type":"ContainerStarted","Data":"0d54baee9468506d654bdead2edbe067a0624cb9f3d099cb47b6df832ca221c2"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.383558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.384534 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" event={"ID":"1d080332-c215-44c1-a027-65afbc612f88","Type":"ContainerStarted","Data":"9efb482e4d2b0037c0e6660d1e9de43b853428582153c6ff1d5456d965e77cd6"} Mar 21 04:51:05 crc 
kubenswrapper[4775]: I0321 04:51:05.385586 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dnv7h" event={"ID":"ecae58ef-adaa-42ea-9246-49845334a819","Type":"ContainerStarted","Data":"e174b1b5351adfc3ac99f9c2b2eb0b849a681af6c4e663fff96b061851393e99"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.385611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dnv7h" event={"ID":"ecae58ef-adaa-42ea-9246-49845334a819","Type":"ContainerStarted","Data":"60c1166ebc7144e4d0ca3bd792fa45f2a6a46e6b7dd9e914c1abbb31f09ef797"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.386376 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" event={"ID":"edb67e99-0d50-46e9-adbb-c0831dd915d8","Type":"ContainerStarted","Data":"a0107ab6b2aa658877064abe6b5ae6a91c533d203e88950d607e2469e04edc54"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.387038 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" event={"ID":"68d61e65-8275-4862-9dae-a75029889b2a","Type":"ContainerStarted","Data":"7c38f9475d7e5c854a00a5490720d86e44d20915c3af5d6a9fb49e6ae7a38696"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.402092 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2z46g" event={"ID":"a79bbac6-f40a-4c92-8854-7ab5e72573cc","Type":"ContainerStarted","Data":"a99acbaab0f76819308ab208aae43a9cd832350fb9839c3fa49ffcc6ee292566"} Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.402212 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r6c6z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial 
tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.402245 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.403342 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.403374 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.414145 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.914077058 +0000 UTC m=+218.890540672 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.414215 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.415028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.415189 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.415497 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:05.915488369 +0000 UTC m=+218.891951993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.442096 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02b06614_31da_49d9_bc97_cf61b065d42f.slice/crio-72a6014e8a8c00b4814a9127976a071de2c161ae17ce9251e2050d85143f64b4 WatchSource:0}: Error finding container 72a6014e8a8c00b4814a9127976a071de2c161ae17ce9251e2050d85143f64b4: Status 404 returned error can't find the container with id 72a6014e8a8c00b4814a9127976a071de2c161ae17ce9251e2050d85143f64b4 Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.451468 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" podStartSLOduration=159.451445371 podStartE2EDuration="2m39.451445371s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:05.450406302 +0000 UTC m=+218.426869926" watchObservedRunningTime="2026-03-21 04:51:05.451445371 +0000 UTC m=+218.427909005" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.516519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 
04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.518087 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.018062885 +0000 UTC m=+218.994526549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.559993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.590342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.618934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.619638 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:06.119619601 +0000 UTC m=+219.096083245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.626594 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.636522 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.653857 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6jnn5"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.670486 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.699005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2lnkp"] Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.719837 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 
04:51:05.720240 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.22022331 +0000 UTC m=+219.196686934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.726671 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0bb62e0_24b1_4ee5_b8c0_077a8f2aeeb8.slice/crio-7a3e51d71262c4421a06e5c99d2e5a2ee3a4cb2aaf4dd59819720b466894bbfd WatchSource:0}: Error finding container 7a3e51d71262c4421a06e5c99d2e5a2ee3a4cb2aaf4dd59819720b466894bbfd: Status 404 returned error can't find the container with id 7a3e51d71262c4421a06e5c99d2e5a2ee3a4cb2aaf4dd59819720b466894bbfd Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.732420 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5aa6958_e573_4efb_a031_218c62b0bec9.slice/crio-4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d WatchSource:0}: Error finding container 4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d: Status 404 returned error can't find the container with id 4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.816869 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.818884 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.818930 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.821544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.821901 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.3218829 +0000 UTC m=+219.298346524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.863487 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dzxd5" podStartSLOduration=159.863468084 podStartE2EDuration="2m39.863468084s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:05.857524774 +0000 UTC m=+218.833988398" watchObservedRunningTime="2026-03-21 04:51:05.863468084 +0000 UTC m=+218.839931708" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.922480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.922624 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.422608223 +0000 UTC m=+219.399071847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.922880 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:05 crc kubenswrapper[4775]: E0321 04:51:05.923158 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.423150368 +0000 UTC m=+219.399613992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.931728 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2z46g" podStartSLOduration=159.931707564 podStartE2EDuration="2m39.931707564s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:05.930130619 +0000 UTC m=+218.906594263" watchObservedRunningTime="2026-03-21 04:51:05.931707564 +0000 UTC m=+218.908171188" Mar 21 04:51:05 crc kubenswrapper[4775]: I0321 04:51:05.932366 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-468g4" podStartSLOduration=159.932359233 podStartE2EDuration="2m39.932359233s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:05.891900791 +0000 UTC m=+218.868364415" watchObservedRunningTime="2026-03-21 04:51:05.932359233 +0000 UTC m=+218.908822857" Mar 21 04:51:05 crc kubenswrapper[4775]: W0321 04:51:05.974648 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55e67e9_9980_4878_9875_a07207894f6f.slice/crio-973fb4f2cc44ee4346ef8c713d7373ffea3fdbff448ecd56e7c8391ee6ce48e6 WatchSource:0}: Error finding container 
973fb4f2cc44ee4346ef8c713d7373ffea3fdbff448ecd56e7c8391ee6ce48e6: Status 404 returned error can't find the container with id 973fb4f2cc44ee4346ef8c713d7373ffea3fdbff448ecd56e7c8391ee6ce48e6 Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.023916 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.024056 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.524030116 +0000 UTC m=+219.500493740 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.024213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.024610 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.524587582 +0000 UTC m=+219.501051226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.125323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.125540 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.62551566 +0000 UTC m=+219.601979284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.125920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.126183 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.626173719 +0000 UTC m=+219.602637343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.226460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.226637 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.726613523 +0000 UTC m=+219.703077147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.226759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.227012 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.727001035 +0000 UTC m=+219.703464659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.327598 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.327696 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.827678206 +0000 UTC m=+219.804141830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.327785 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.328039 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.828030626 +0000 UTC m=+219.804494240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.407762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-wb89g" event={"ID":"cd25c8a4-8047-4602-a95b-3308af65bd38","Type":"ContainerStarted","Data":"05b88dca4e9352028f202490cd0cd91a11f14d92891caca0835ae23e4321056a"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.409352 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs5pr" event={"ID":"ae87529c-e52f-45b0-9bbf-2a652e628bc5","Type":"ContainerStarted","Data":"67f32c8a2a75b425e13a6954ee446b7d5f82b03031a619ecb3e2549b702adedc"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.411063 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" event={"ID":"918078e1-af38-475a-86d5-8179cafa18db","Type":"ContainerStarted","Data":"b1b80ce740efd38d0818a1758ffcc9a25de23ff4e3d9a46b3ed20d9fb4f1765f"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.413009 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" event={"ID":"1017f8be-2192-46a4-8717-ead73ca5e81b","Type":"ContainerStarted","Data":"14694a169bb5c4955d641d259ab6b1e11c221e9de14ac903c5c305b569f98734"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.414366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" 
event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerStarted","Data":"73bfd1fd04d6bcc9e68b01d3bd224e4608501bca9349188673dd3d6681965588"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.415650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" event={"ID":"a5aa6958-e573-4efb-a031-218c62b0bec9","Type":"ContainerStarted","Data":"4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.417366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" event={"ID":"9de83f0b-7dd2-4846-a1ce-c8af930778f4","Type":"ContainerStarted","Data":"2731fba8805ee077aab9635f83ab8d0734ef47f44e0bcd4695582de53fd7acba"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.418335 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" event={"ID":"e5224539-6d29-4bc3-9656-4665eb287e28","Type":"ContainerStarted","Data":"631844f264e2c2c60c82cbb55b5b9b825b4c2c41854e332cffa0322b4537e9e1"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.419380 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" event={"ID":"07720715-40fb-4f7d-8f6b-381dc7cf9ea2","Type":"ContainerStarted","Data":"e203b5acaed5b8428d41bc14f17c02edfa1433e80d67ab20e5c1162cf529a5c5"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.420590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" event={"ID":"b0a24bc9-b395-4732-ba57-b096ee9ffdb1","Type":"ContainerStarted","Data":"5630ce022553fe34ac22d879a625a130a1d5f8464075a4e9b7f945e66d1d43db"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.421971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" event={"ID":"a49709c7-59e0-440e-89c2-177c42cd28e8","Type":"ContainerStarted","Data":"7e4b00575887da7acfd9d6e07116fb0ebf6717dff53758a618d5c9d92741f392"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.426332 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" event={"ID":"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0","Type":"ContainerStarted","Data":"d60caa6b7c834cd2c81448a4d8ff279ec53ea7cd58de54b2a9f6ffeef78f88dd"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.427599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" event={"ID":"d55e67e9-9980-4878-9875-a07207894f6f","Type":"ContainerStarted","Data":"973fb4f2cc44ee4346ef8c713d7373ffea3fdbff448ecd56e7c8391ee6ce48e6"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.428619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.429227 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:06.929201322 +0000 UTC m=+219.905664986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.429894 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" event={"ID":"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8","Type":"ContainerStarted","Data":"7a3e51d71262c4421a06e5c99d2e5a2ee3a4cb2aaf4dd59819720b466894bbfd"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.432056 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jnn5" event={"ID":"35c6eec5-7117-48de-a4c5-640517da6a76","Type":"ContainerStarted","Data":"d4b3eac7e2e58a362d63264ddf58256cf443c42ec6564a89e5121145388f21ab"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.433297 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" event={"ID":"02b06614-31da-49d9-bc97-cf61b065d42f","Type":"ContainerStarted","Data":"72a6014e8a8c00b4814a9127976a071de2c161ae17ce9251e2050d85143f64b4"} Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.470556 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dnv7h" podStartSLOduration=6.470533359 podStartE2EDuration="6.470533359s" podCreationTimestamp="2026-03-21 04:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:06.469095057 +0000 UTC m=+219.445558691" 
watchObservedRunningTime="2026-03-21 04:51:06.470533359 +0000 UTC m=+219.446997003" Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.531539 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.533898 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.033871368 +0000 UTC m=+220.010335012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.632553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.632852 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.13283602 +0000 UTC m=+220.109299634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.734290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.734656 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.234635053 +0000 UTC m=+220.211098737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.817817 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.817898 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.836095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.836699 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.336646443 +0000 UTC m=+220.313110067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.838973 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.841483 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.341453101 +0000 UTC m=+220.317916725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:06 crc kubenswrapper[4775]: I0321 04:51:06.940412 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:06 crc kubenswrapper[4775]: E0321 04:51:06.940809 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.440787564 +0000 UTC m=+220.417251188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.041861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.042307 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.542286079 +0000 UTC m=+220.518749703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.143787 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.144239 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.644224706 +0000 UTC m=+220.620688330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.244876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.245299 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.745284379 +0000 UTC m=+220.721748003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.345513 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.345700 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.845677162 +0000 UTC m=+220.822140786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.345831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.346195 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.846183916 +0000 UTC m=+220.822647540 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.438790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" event={"ID":"ed9baf4c-70b2-450f-9b21-76dfafbc44d0","Type":"ContainerStarted","Data":"cadd789c9a6b6b6dccb1af86d4359d6025e42837584cd6624c1767f1cecac34c"} Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.439934 4775 generic.go:334] "Generic (PLEG): container finished" podID="892d15b6-460e-4892-a836-0cc284c8a326" containerID="16776553b2840887617e187a152ecb08b84ab564361957b0471e5c74a0b97f4f" exitCode=0 Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.439989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" event={"ID":"892d15b6-460e-4892-a836-0cc284c8a326","Type":"ContainerDied","Data":"16776553b2840887617e187a152ecb08b84ab564361957b0471e5c74a0b97f4f"} Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.441152 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" event={"ID":"edb4245a-7971-4c40-81b6-27d56b319a2f","Type":"ContainerStarted","Data":"93335dc6f90290b36d46a46319c144e689ee8335d4f0c54f8222753aeee4eb14"} Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.446727 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.447724 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:07.947705382 +0000 UTC m=+220.924168996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.451858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" event={"ID":"10a9b5d9-308c-4971-a073-9c88de98d8ea","Type":"ContainerStarted","Data":"c99542331ad097e2f95ddfa1e7556cebdd46fb352aef4933c487cf230490583e"} Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.453628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" event={"ID":"edb67e99-0d50-46e9-adbb-c0831dd915d8","Type":"ContainerStarted","Data":"f3a29dd8f77e6a4b94910c0c0c74c2afe3c0b75e0fdd9a8fcf71d4f7770d5653"} Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.453789 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:07 crc kubenswrapper[4775]: 
I0321 04:51:07.459795 4775 patch_prober.go:28] interesting pod/console-operator-58897d9998-fc4z8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.459869 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" podUID="12b1cc27-6a60-43d2-9d5e-eb7a54c1e899" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.470901 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" podStartSLOduration=161.470886928 podStartE2EDuration="2m41.470886928s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:07.469065805 +0000 UTC m=+220.445529429" watchObservedRunningTime="2026-03-21 04:51:07.470886928 +0000 UTC m=+220.447350552" Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.484530 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5pdcr" podStartSLOduration=161.484513379 podStartE2EDuration="2m41.484513379s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:07.484285242 +0000 UTC m=+220.460748886" watchObservedRunningTime="2026-03-21 04:51:07.484513379 +0000 UTC m=+220.460977003" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.550465 4775 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.050412722 +0000 UTC m=+221.026876346 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.549753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.652465 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.652934 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.152912565 +0000 UTC m=+221.129376189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.753783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.754072 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.254060519 +0000 UTC m=+221.230524143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.818484 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.818577 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.855200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.855456 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.35542881 +0000 UTC m=+221.331892434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.855681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.855948 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.355941035 +0000 UTC m=+221.332404659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.957357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.957646 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.457608795 +0000 UTC m=+221.434072419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:07 crc kubenswrapper[4775]: I0321 04:51:07.957755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:07 crc kubenswrapper[4775]: E0321 04:51:07.958436 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.458424608 +0000 UTC m=+221.434888412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.059834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.059993 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.559963044 +0000 UTC m=+221.536426668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.060302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.060727 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.560717136 +0000 UTC m=+221.537180930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.162420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.162585 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.662554081 +0000 UTC m=+221.639017705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.162805 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.163274 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.663266561 +0000 UTC m=+221.639730175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.264694 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.264901 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.764870689 +0000 UTC m=+221.741334313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.265069 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.265430 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.765420685 +0000 UTC m=+221.741884309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.368629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.369018 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.86899978 +0000 UTC m=+221.845463404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.460433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" event={"ID":"b0b465ff-f8f3-4a99-9235-1ddc4ce093e4","Type":"ContainerStarted","Data":"18c05d7f58a99612dc5efe5ad65e1a4f010d0505662765c2b5f2cf9c83622c20"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.462044 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" event={"ID":"eddac2a3-8b07-410d-9e9f-f7ce0c05abf5","Type":"ContainerStarted","Data":"9242d5c32349be04a03c56f0b3585e84f1337e2ab99ece891d25c14234eae8d5"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.463705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" event={"ID":"8bb7828f-6d99-4539-8312-c8e96bfbc608","Type":"ContainerStarted","Data":"590ca6e3c97d44acdce1b2b5e4b30315b8b8022f7cd8a1624210df48524edcf8"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.466568 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" event={"ID":"b0a24bc9-b395-4732-ba57-b096ee9ffdb1","Type":"ContainerStarted","Data":"a312561e8254adf11616d36ba39133cb34b7dc608bc2da8feb55af6233140c65"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.468001 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" event={"ID":"02b06614-31da-49d9-bc97-cf61b065d42f","Type":"ContainerStarted","Data":"cbc98787ecc0c6c4832727fda1f187cf8a193cad3b2d5a9a0ebc67dae7ea78ea"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.469194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" event={"ID":"edb4245a-7971-4c40-81b6-27d56b319a2f","Type":"ContainerStarted","Data":"9691da391ada20ada3b71bb2d7e26ca8ba0f89f4dce37b834b05ee3c9e082918"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.469655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.469976 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:08.969963159 +0000 UTC m=+221.946426783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.470686 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" event={"ID":"918078e1-af38-475a-86d5-8179cafa18db","Type":"ContainerStarted","Data":"d43ab3ec1001720c6f472efab2a3f71284b8cb8e3357689550716f3a74fd014e"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.472331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" event={"ID":"fddde3da-8512-4e62-9c38-b59f98e117e0","Type":"ContainerStarted","Data":"e7db5d2af2536f63217dcf1801711995965b013eb5a3eabb8f70b3e311b18d76"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.473825 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" event={"ID":"1d080332-c215-44c1-a027-65afbc612f88","Type":"ContainerStarted","Data":"0d2a7ce003a897c99bcb1c1e9cbeaa9b5ad53f73f02e92d0cc9aaa2361fe518d"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.475189 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" event={"ID":"811f064c-ebf4-48ad-87a0-83205eb1eca5","Type":"ContainerStarted","Data":"91b9d97978afcef7d317d0be26aeb6112ab84e003138a0924c602ef35b20e3c5"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.476271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" event={"ID":"0c0dc3b9-8e11-4ff0-8f26-549ce47215f0","Type":"ContainerStarted","Data":"ecf2efe54a36b3fb6058b5bfe0d55546e6f3136149f8eabafeb20ba7213485d9"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.477402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" event={"ID":"d55e67e9-9980-4878-9875-a07207894f6f","Type":"ContainerStarted","Data":"3d338a745024ac75c81bf2abc4d2ef739c654f6537a34419b6ab721180950f6d"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.478498 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" event={"ID":"bc2b8485-8cc1-4029-8318-397d4278e455","Type":"ContainerStarted","Data":"9edced4268caed98927ad877592a67e465613569cffdc221ea601b6eae827973"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.479807 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" event={"ID":"f9395e24-0d4b-4165-bf60-068876927f58","Type":"ContainerStarted","Data":"25df534758847bc5ccc6009699a59047abe78191ef225b4f013fadcba9be1f14"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.481531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerStarted","Data":"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.484326 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" event={"ID":"2dbc150e-3a25-4b44-b01c-effe99de5152","Type":"ContainerStarted","Data":"5623ba27f455dcdbf40b96e8e00a12d1541682430e1b057b616d215082ae7128"} 
Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.488995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" event={"ID":"a5aa6958-e573-4efb-a031-218c62b0bec9","Type":"ContainerStarted","Data":"1b93f4361b64d716502128254e9ea25b78ba9ed772325baf5c04d50a5a77ab40"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.490925 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" event={"ID":"e5224539-6d29-4bc3-9656-4665eb287e28","Type":"ContainerStarted","Data":"75b230198fb6a70ca3ef179e91486c0ac56c1b4e6cb9288659a34229d0e5d109"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.492249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" event={"ID":"68d61e65-8275-4862-9dae-a75029889b2a","Type":"ContainerStarted","Data":"0fab95b37a2e6ed33a86e59e397871dbda6fae2eaae9335694addbaa577daf2b"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.493409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" event={"ID":"a49709c7-59e0-440e-89c2-177c42cd28e8","Type":"ContainerStarted","Data":"8d28f7922639a7fd05cf65cac943f2930631c08e489a583966597a6f307142d7"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.494500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs5pr" event={"ID":"ae87529c-e52f-45b0-9bbf-2a652e628bc5","Type":"ContainerStarted","Data":"e2ab3430d2b9e7725b9fe97cfaf6740b73f817210e59fff19d1b2eb8d4004cce"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.495627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6jnn5" 
event={"ID":"35c6eec5-7117-48de-a4c5-640517da6a76","Type":"ContainerStarted","Data":"8b385a6616a741e413b81fd1ad0ea83118f452733c49d9fa7d2ed79f401bb413"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.508042 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" event={"ID":"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8","Type":"ContainerStarted","Data":"e12324fff0bd481ca62c439bc0330f51748fe72d84ac697cfe3adde23279b170"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.510663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" event={"ID":"07720715-40fb-4f7d-8f6b-381dc7cf9ea2","Type":"ContainerStarted","Data":"4b385123fb6a15d90b1cec5c84204e7c26de4b5079a1b283ad23d8b734086ae6"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.512329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" event={"ID":"892d15b6-460e-4892-a836-0cc284c8a326","Type":"ContainerStarted","Data":"849182efddb8c3413751c9d3506283faddbabe30d171beba88cbea21627385e0"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.513725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wst2s" event={"ID":"a9ed1e0e-eff9-4690-bcf5-45f6074c200e","Type":"ContainerStarted","Data":"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.515597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" event={"ID":"1017f8be-2192-46a4-8717-ead73ca5e81b","Type":"ContainerStarted","Data":"af2a3164a7ad4571de890915ec43f2b5bddc2dae35ae42a0b0637e6a7790886d"} Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.517351 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.517383 4775 patch_prober.go:28] interesting pod/console-operator-58897d9998-fc4z8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.517416 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" podUID="12b1cc27-6a60-43d2-9d5e-eb7a54c1e899" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.517949 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-55lcp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.517979 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.541219 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xv7nw" podStartSLOduration=162.541186064 podStartE2EDuration="2m42.541186064s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:08.53892791 +0000 UTC m=+221.515391534" watchObservedRunningTime="2026-03-21 04:51:08.541186064 +0000 UTC m=+221.517649688" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.576616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.577884 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.077864508 +0000 UTC m=+222.054328132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.619100 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" podStartSLOduration=162.619078251 podStartE2EDuration="2m42.619078251s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:08.599717255 +0000 UTC m=+221.576180879" watchObservedRunningTime="2026-03-21 04:51:08.619078251 +0000 UTC m=+221.595541875" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.620019 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9v866" podStartSLOduration=162.620012448 podStartE2EDuration="2m42.620012448s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:08.617344482 +0000 UTC m=+221.593808116" watchObservedRunningTime="2026-03-21 04:51:08.620012448 +0000 UTC m=+221.596476082" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.678234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.678684 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.178667743 +0000 UTC m=+222.155131367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.779528 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.779678 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.279659553 +0000 UTC m=+222.256123177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.780139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.780413 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.280405204 +0000 UTC m=+222.256868828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.839639 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:08 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:08 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:08 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.839697 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.881740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.882088 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:09.382073694 +0000 UTC m=+222.358537318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:08 crc kubenswrapper[4775]: I0321 04:51:08.983676 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:08 crc kubenswrapper[4775]: E0321 04:51:08.983965 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.48395441 +0000 UTC m=+222.460418024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.084891 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.085052 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.585021403 +0000 UTC m=+222.561485027 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.085132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.085439 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.585428304 +0000 UTC m=+222.561891928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.186536 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.186731 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.686705983 +0000 UTC m=+222.663169607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.186783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.187169 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.687156866 +0000 UTC m=+222.663620490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.287965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.288198 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.788176987 +0000 UTC m=+222.764640611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.288310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.288667 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.788658011 +0000 UTC m=+222.765121635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.388973 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.389143 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.889094535 +0000 UTC m=+222.865558159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.389285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.389716 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.889679592 +0000 UTC m=+222.866143216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.492793 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.493366 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:09.993340559 +0000 UTC m=+222.969804193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.520773 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-55lcp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.520844 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.522044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.522236 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jd86b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.522280 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" 
podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.544354 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" podStartSLOduration=163.544338904 podStartE2EDuration="2m43.544338904s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.542671386 +0000 UTC m=+222.519135000" watchObservedRunningTime="2026-03-21 04:51:09.544338904 +0000 UTC m=+222.520802528" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.558498 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p" podStartSLOduration=163.55847487 podStartE2EDuration="2m43.55847487s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.557740609 +0000 UTC m=+222.534204243" watchObservedRunningTime="2026-03-21 04:51:09.55847487 +0000 UTC m=+222.534938504" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.591782 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" podStartSLOduration=163.591764096 podStartE2EDuration="2m43.591764096s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.590561951 +0000 UTC m=+222.567025575" watchObservedRunningTime="2026-03-21 04:51:09.591764096 +0000 UTC 
m=+222.568227720" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.594752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.612903 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.112877822 +0000 UTC m=+223.089341446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.622446 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" podStartSLOduration=163.622425346 podStartE2EDuration="2m43.622425346s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.618634198 +0000 UTC m=+222.595097842" watchObservedRunningTime="2026-03-21 04:51:09.622425346 +0000 UTC m=+222.598888970" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.639505 4775 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4nfs6" podStartSLOduration=163.639488926 podStartE2EDuration="2m43.639488926s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.638536049 +0000 UTC m=+222.614999673" watchObservedRunningTime="2026-03-21 04:51:09.639488926 +0000 UTC m=+222.615952550" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.696651 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.696809 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.196778862 +0000 UTC m=+223.173242486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.697190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.697433 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.19742288 +0000 UTC m=+223.173886504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.707325 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" podStartSLOduration=163.707307634 podStartE2EDuration="2m43.707307634s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.65841997 +0000 UTC m=+222.634883584" watchObservedRunningTime="2026-03-21 04:51:09.707307634 +0000 UTC m=+222.683771268" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.708539 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" podStartSLOduration=163.708531789 podStartE2EDuration="2m43.708531789s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.704586586 +0000 UTC m=+222.681050210" watchObservedRunningTime="2026-03-21 04:51:09.708531789 +0000 UTC m=+222.684995413" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.745204 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" podStartSLOduration=163.745185092 podStartE2EDuration="2m43.745185092s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.72492891 +0000 UTC m=+222.701392544" watchObservedRunningTime="2026-03-21 04:51:09.745185092 +0000 UTC m=+222.721648716" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.763076 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gd4gc" podStartSLOduration=163.763056275 podStartE2EDuration="2m43.763056275s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.762352885 +0000 UTC m=+222.738816509" watchObservedRunningTime="2026-03-21 04:51:09.763056275 +0000 UTC m=+222.739519909" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.763453 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6jnn5" podStartSLOduration=9.763446106 podStartE2EDuration="9.763446106s" podCreationTimestamp="2026-03-21 04:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.748560139 +0000 UTC m=+222.725023763" watchObservedRunningTime="2026-03-21 04:51:09.763446106 +0000 UTC m=+222.739909730" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.796837 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5ztsd" podStartSLOduration=163.796819175 podStartE2EDuration="2m43.796819175s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.783290996 +0000 UTC m=+222.759754640" watchObservedRunningTime="2026-03-21 
04:51:09.796819175 +0000 UTC m=+222.773282799" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.798254 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.798389 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.298372169 +0000 UTC m=+223.274835793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.798650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.798957 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:10.298946066 +0000 UTC m=+223.275409740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.799558 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-725jh" podStartSLOduration=163.799548873 podStartE2EDuration="2m43.799548873s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.796631849 +0000 UTC m=+222.773095483" watchObservedRunningTime="2026-03-21 04:51:09.799548873 +0000 UTC m=+222.776012497" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.816724 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-blwmr" podStartSLOduration=163.816706336 podStartE2EDuration="2m43.816706336s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.812474414 +0000 UTC m=+222.788938048" watchObservedRunningTime="2026-03-21 04:51:09.816706336 +0000 UTC m=+222.793169960" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.819543 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:09 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:09 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:09 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.819595 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.828417 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wqjnj" podStartSLOduration=163.828401372 podStartE2EDuration="2m43.828401372s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.828312459 +0000 UTC m=+222.804776083" watchObservedRunningTime="2026-03-21 04:51:09.828401372 +0000 UTC m=+222.804864996" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.856975 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" podStartSLOduration=163.856957682 podStartE2EDuration="2m43.856957682s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.855763178 +0000 UTC m=+222.832226802" watchObservedRunningTime="2026-03-21 04:51:09.856957682 +0000 UTC m=+222.833421306" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.858053 4775 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7xpn" podStartSLOduration=163.858046443 podStartE2EDuration="2m43.858046443s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.844315769 +0000 UTC m=+222.820779383" watchObservedRunningTime="2026-03-21 04:51:09.858046443 +0000 UTC m=+222.834510077" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.891089 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2zh8p" podStartSLOduration=163.891072892 podStartE2EDuration="2m43.891072892s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.874030452 +0000 UTC m=+222.850494076" watchObservedRunningTime="2026-03-21 04:51:09.891072892 +0000 UTC m=+222.867536516" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.892725 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wst2s" podStartSLOduration=163.892719319 podStartE2EDuration="2m43.892719319s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:09.889345702 +0000 UTC m=+222.865809326" watchObservedRunningTime="2026-03-21 04:51:09.892719319 +0000 UTC m=+222.869182943" Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.899639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.899778 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.399759441 +0000 UTC m=+223.376223065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.899890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:09 crc kubenswrapper[4775]: E0321 04:51:09.900184 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.400177913 +0000 UTC m=+223.376641537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.925298 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43608: no serving certificate available for the kubelet"
Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.966030 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43610: no serving certificate available for the kubelet"
Mar 21 04:51:09 crc kubenswrapper[4775]: I0321 04:51:09.991979 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43622: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.001002 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.001201 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.501174904 +0000 UTC m=+223.477638528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.060806 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43630: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.079137 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43640: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.102427 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.102800 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.602782952 +0000 UTC m=+223.579246576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.164100 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43644: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.203616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.204050 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.704015729 +0000 UTC m=+223.680479393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.305661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.306069 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.806053569 +0000 UTC m=+223.782517193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.351065 4775 ???:1] "http: TLS handshake error from 192.168.126.11:43648: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.407324 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.407646 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:10.907595896 +0000 UTC m=+223.884059520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.508998 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.509423 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.00940382 +0000 UTC m=+223.985867444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.527923 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" event={"ID":"f0bb62e0-24b1-4ee5-b8c0-077a8f2aeeb8","Type":"ContainerStarted","Data":"a8337c07a9f91d9d03dd6039d3198c7ccdfcdfb956af5d440f3c4e0bd2d75826"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.531156 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" event={"ID":"6e22b5c7-4191-4f21-82ba-3014ccc4e978","Type":"ContainerStarted","Data":"9d744d38f317071bd769b3e7072f8fab149a6bbf4171f825b3b0d5e6cdaf2588"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.536020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs5pr" event={"ID":"ae87529c-e52f-45b0-9bbf-2a652e628bc5","Type":"ContainerStarted","Data":"8ad0a1aa3f4ed910a583b987476e23f0c356f3beb7b29b09ef9f390ee28dfdcd"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.536158 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zs5pr"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.538044 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" event={"ID":"b0a24bc9-b395-4732-ba57-b096ee9ffdb1","Type":"ContainerStarted","Data":"719ce0747f05487b39ed1cd009257ecf52b6ac8ce8de87d8bcfdf50d6d1e4082"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.543885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" event={"ID":"02b06614-31da-49d9-bc97-cf61b065d42f","Type":"ContainerStarted","Data":"defd41f64a00fac497123059d0e52b8cafbca23fb2bba9bf796fb94315df2e30"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.550205 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vsh8z" podStartSLOduration=164.550184291 podStartE2EDuration="2m44.550184291s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.54843468 +0000 UTC m=+223.524898324" watchObservedRunningTime="2026-03-21 04:51:10.550184291 +0000 UTC m=+223.526647915"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.554437 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" event={"ID":"bc2b8485-8cc1-4029-8318-397d4278e455","Type":"ContainerStarted","Data":"1e8daaf27e70ace4c3af569a43c51111922c374bd73158cd17848aaae6a12515"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.555070 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.562869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" event={"ID":"68d61e65-8275-4862-9dae-a75029889b2a","Type":"ContainerStarted","Data":"ac1cbe68dd73ea5c8caacc3b618dba0ce5e706de8e4108c1b4c621d135d2097c"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.575955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" event={"ID":"811f064c-ebf4-48ad-87a0-83205eb1eca5","Type":"ContainerStarted","Data":"422091c765551c45cf53928bcb6bbf7cb726ca790f73aa24731ad99b2ab51f79"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.576069 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8kpqj" podStartSLOduration=164.576044923 podStartE2EDuration="2m44.576044923s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.571909935 +0000 UTC m=+223.548373559" watchObservedRunningTime="2026-03-21 04:51:10.576044923 +0000 UTC m=+223.552508547"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.588323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" event={"ID":"edb4245a-7971-4c40-81b6-27d56b319a2f","Type":"ContainerStarted","Data":"f0dec679b3b89379b4d0e57d8a3c9e1005a582f726c8219969bad32fa27e3e7a"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.602356 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" event={"ID":"2dbc150e-3a25-4b44-b01c-effe99de5152","Type":"ContainerStarted","Data":"c79b3392d38d1173c38edbc74e51d7e1ebb61bf920208dfdec4b3116dbff3597"}
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.603108 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jd86b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body=
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.603176 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.607751 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2lnkp" podStartSLOduration=164.607734333 podStartE2EDuration="2m44.607734333s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.607176277 +0000 UTC m=+223.583639911" watchObservedRunningTime="2026-03-21 04:51:10.607734333 +0000 UTC m=+223.584197947"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.610243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.610348 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.110325938 +0000 UTC m=+224.086789562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.611457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.613344 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.113332014 +0000 UTC m=+224.089795638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.660894 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7sfkh" podStartSLOduration=164.66087484 podStartE2EDuration="2m44.66087484s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.659709236 +0000 UTC m=+223.636172860" watchObservedRunningTime="2026-03-21 04:51:10.66087484 +0000 UTC m=+223.637338464"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.662246 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zs5pr" podStartSLOduration=10.662237269 podStartE2EDuration="10.662237269s" podCreationTimestamp="2026-03-21 04:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.643922493 +0000 UTC m=+223.620386117" watchObservedRunningTime="2026-03-21 04:51:10.662237269 +0000 UTC m=+223.638700893"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.684264 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc" podStartSLOduration=164.684241251 podStartE2EDuration="2m44.684241251s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.683547311 +0000 UTC m=+223.660010935" watchObservedRunningTime="2026-03-21 04:51:10.684241251 +0000 UTC m=+223.660704875"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.697010 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34200: no serving certificate available for the kubelet"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.702399 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-57cxh" podStartSLOduration=164.702378942 podStartE2EDuration="2m44.702378942s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.701534417 +0000 UTC m=+223.677998041" watchObservedRunningTime="2026-03-21 04:51:10.702378942 +0000 UTC m=+223.678842566"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.712758 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.714690 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.214675545 +0000 UTC m=+224.191139169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.740125 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vjv4q" podStartSLOduration=164.739965931 podStartE2EDuration="2m44.739965931s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.736862802 +0000 UTC m=+223.713326436" watchObservedRunningTime="2026-03-21 04:51:10.739965931 +0000 UTC m=+223.716429555"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.765906 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-k5dks" podStartSLOduration=164.765872565 podStartE2EDuration="2m44.765872565s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:10.763797045 +0000 UTC m=+223.740260669" watchObservedRunningTime="2026-03-21 04:51:10.765872565 +0000 UTC m=+223.742336189"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.817958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.818290 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.31827768 +0000 UTC m=+224.294741294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.822609 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:10 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:10 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:10 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.822643 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:10 crc kubenswrapper[4775]: I0321 04:51:10.918948 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:10 crc kubenswrapper[4775]: E0321 04:51:10.919327 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.419313152 +0000 UTC m=+224.395776776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.020186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.020510 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.520497308 +0000 UTC m=+224.496960932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.121067 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.121256 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.62122932 +0000 UTC m=+224.597692944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.121335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.121662 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.621653173 +0000 UTC m=+224.598116797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.222564 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.222763 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.722737216 +0000 UTC m=+224.699200850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.223075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.223451 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.723441266 +0000 UTC m=+224.699904880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.324134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.324576 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.824555429 +0000 UTC m=+224.801019053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.390791 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34206: no serving certificate available for the kubelet"
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.426295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.426609 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:11.926593589 +0000 UTC m=+224.903057213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.527408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.528191 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.028169156 +0000 UTC m=+225.004632780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.621443 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" event={"ID":"6e22b5c7-4191-4f21-82ba-3014ccc4e978","Type":"ContainerStarted","Data":"ea1b2f10b0c4b51db9b95325c0c76b92a590974e1b40882e8f0862257853dcab"}
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.627460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" event={"ID":"9de83f0b-7dd2-4846-a1ce-c8af930778f4","Type":"ContainerStarted","Data":"c74b513c5d5a5ec41643b4b209c53ff6b64295bf34309a4ab65491776584358e"}
Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.630073 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.630420 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.130406712 +0000 UTC m=+225.106870326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.678209 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" podStartSLOduration=165.678191085 podStartE2EDuration="2m45.678191085s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:11.677260368 +0000 UTC m=+224.653724002" watchObservedRunningTime="2026-03-21 04:51:11.678191085 +0000 UTC m=+224.654654709" Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.731466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.732971 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.232952067 +0000 UTC m=+225.209415691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.826756 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:11 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:11 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:11 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.826815 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.833728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.834194 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:12.334173344 +0000 UTC m=+225.310636968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.935542 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.935730 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.43570224 +0000 UTC m=+225.412165864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:11 crc kubenswrapper[4775]: I0321 04:51:11.935886 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:11 crc kubenswrapper[4775]: E0321 04:51:11.936215 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.436206065 +0000 UTC m=+225.412669689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.036961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.037097 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.537080342 +0000 UTC m=+225.513543956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.037290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.037608 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.537597957 +0000 UTC m=+225.514061591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.138883 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.139056 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.63902916 +0000 UTC m=+225.615492784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.139213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.139524 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.639514994 +0000 UTC m=+225.615978618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.168779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.170895 4775 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mthhs container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.170934 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" podUID="892d15b6-460e-4892-a836-0cc284c8a326" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.170898 4775 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mthhs container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.171081 4775 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" podUID="892d15b6-460e-4892-a836-0cc284c8a326" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.171290 4775 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mthhs container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.171325 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs" podUID="892d15b6-460e-4892-a836-0cc284c8a326" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.240239 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.240399 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.74037579 +0000 UTC m=+225.716839414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.240534 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.240764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.740756651 +0000 UTC m=+225.717220275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.341517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.341839 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.841825904 +0000 UTC m=+225.818289528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.443227 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.443549 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:12.943538065 +0000 UTC m=+225.920001689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.544079 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.544502 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.044469553 +0000 UTC m=+226.020933177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.544851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.545224 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.045208885 +0000 UTC m=+226.021672529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.645649 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.646071 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.146050991 +0000 UTC m=+226.122514625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.743125 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34214: no serving certificate available for the kubelet" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.747157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.748366 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.248348949 +0000 UTC m=+226.224812573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.819737 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:12 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:12 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:12 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.819809 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.848187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.848610 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:13.348592988 +0000 UTC m=+226.325056612 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.950328 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:12 crc kubenswrapper[4775]: E0321 04:51:12.950745 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.450725481 +0000 UTC m=+226.427189105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:12 crc kubenswrapper[4775]: I0321 04:51:12.956025 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.023916 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.023982 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.024349 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.024404 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.051873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.052083 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.552052021 +0000 UTC m=+226.528515645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.052338 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.052716 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.55270489 +0000 UTC m=+226.529168524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.154071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.154415 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.65438887 +0000 UTC m=+226.630852514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.154560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.154852 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.654845313 +0000 UTC m=+226.631308937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.157745 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.157849 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.255997 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.256392 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.756365978 +0000 UTC m=+226.732829602 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.256536 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.256893 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.756875503 +0000 UTC m=+226.733339127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.258673 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.258707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.259788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.260539 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.261625 4775 patch_prober.go:28] interesting pod/console-f9d7485db-wst2s container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.261656 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wst2s" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.261634 4775 patch_prober.go:28] interesting 
pod/apiserver-76f77b778f-rz6g5 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.261727 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5" podUID="6e22b5c7-4191-4f21-82ba-3014ccc4e978" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.266901 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fc4z8" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.308040 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.357413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.358746 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.858724528 +0000 UTC m=+226.835188152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.433781 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.435619 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wrzs4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.435684 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wrzs4 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.435910 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.435806 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.436479 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wrzs4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.436569 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.459227 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.459678 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:13.959661327 +0000 UTC m=+226.936124951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.471336 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.560378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.561352 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.061319736 +0000 UTC m=+227.037783360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.647874 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" event={"ID":"9de83f0b-7dd2-4846-a1ce-c8af930778f4","Type":"ContainerStarted","Data":"dac9469a3446348249e3c427c296e28a0c4f474cc224cec4d3ca45da6d0d9051"} Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.661825 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.662147 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.162135732 +0000 UTC m=+227.138599356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.713627 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.762724 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.764451 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.26443127 +0000 UTC m=+227.240894884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.774711 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v296h" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.810154 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.812774 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.813069 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.815422 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.815732 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.816422 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.824990 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:13 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:13 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:13 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.825371 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.826486 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.866673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.868725 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.368710264 +0000 UTC m=+227.345173898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.906467 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.920433 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kxtlf" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.971425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.971589 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.471562428 +0000 UTC m=+227.448026052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.973304 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.974076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:13 crc kubenswrapper[4775]: I0321 04:51:13.974271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:51:13 crc kubenswrapper[4775]: E0321 04:51:13.976752 4775 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.476734827 +0000 UTC m=+227.453198451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.008564 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.009572 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.017076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.053242 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.075881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.076194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.076276 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.076341 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.076409 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.576395609 +0000 UTC m=+227.552859233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.133183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.133451 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.178854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.179147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.179180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ww6h\" (UniqueName: \"kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.179205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.179478 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.679466159 +0000 UTC m=+227.655929783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.200028 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbtgv"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.201011 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.205387 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.238899 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbtgv"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.258355 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.281646 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zkl7p"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.281859 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.281971 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.781944292 +0000 UTC m=+227.758407916 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282237 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282266 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282293 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ww6h\" (UniqueName: \"kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fzw\" (UniqueName: \"kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282338 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282355 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.282800 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.282934 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.78291995 +0000 UTC m=+227.759383634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.283034 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.353455 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ww6h\" (UniqueName: \"kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h\") pod \"certified-operators-959kj\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.383764 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.383959 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fzw\" (UniqueName: \"kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.384014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.384110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.384827 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.384899 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.884884548 +0000 UTC m=+227.861348172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.385845 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.395990 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.415890 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.426443 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.445785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fzw\" (UniqueName: \"kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw\") pod \"community-operators-wbtgv\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.485319 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.485407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.485432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.485464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98dgw\" (UniqueName: \"kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.485672 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:14.985656112 +0000 UTC m=+227.962119736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.540190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbtgv"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.589661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.589824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98dgw\" (UniqueName: \"kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.589913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.589943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.590372 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.590446 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.090430671 +0000 UTC m=+228.066894295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.591238 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.621883 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-svf9w"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.623012 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.630359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98dgw\" (UniqueName: \"kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw\") pod \"certified-operators-v5fgf\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.635596 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svf9w"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.644229 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-959kj"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.645446 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.696947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.697043 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.697068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wq48\" (UniqueName: \"kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.697103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.701501 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.201483461 +0000 UTC m=+228.177947085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.754416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" event={"ID":"9de83f0b-7dd2-4846-a1ce-c8af930778f4","Type":"ContainerStarted","Data":"262e7c1fa0b36c90633bf27e707ad1c5a6e35614948e91b3bd14fdfe9c8383ff"}
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.764627 4775 generic.go:334] "Generic (PLEG): container finished" podID="a5aa6958-e573-4efb-a031-218c62b0bec9" containerID="1b93f4361b64d716502128254e9ea25b78ba9ed772325baf5c04d50a5a77ab40" exitCode=0
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.765853 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" event={"ID":"a5aa6958-e573-4efb-a031-218c62b0bec9","Type":"ContainerDied","Data":"1b93f4361b64d716502128254e9ea25b78ba9ed772325baf5c04d50a5a77ab40"}
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.813263 4775 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sp69c container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": context deadline exceeded" start-of-body=
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.813642 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" podUID="a49709c7-59e0-440e-89c2-177c42cd28e8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": context deadline exceeded"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.813260 4775 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-sp69c container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.814049 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c" podUID="a49709c7-59e0-440e-89c2-177c42cd28e8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.814604 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.814855 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wq48\" (UniqueName: \"kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.814903 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.814945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.815397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.815464 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.315451084 +0000 UTC m=+228.291914708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.815841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.832605 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-sp69c"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.834024 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:14 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:14 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:14 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.834085 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.847598 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5fgf"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.851899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wq48\" (UniqueName: \"kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48\") pod \"community-operators-svf9w\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.922457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:14 crc kubenswrapper[4775]: E0321 04:51:14.926786 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.426756299 +0000 UTC m=+228.403219923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.940489 4775 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.944858 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.945089 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" containerID="cri-o://2f124547d6c6b6e4cadb63b6c1732b81d7643f21b62f22762426569eabbfc7c4" gracePeriod=30
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.960171 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svf9w"
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.986513 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"]
Mar 21 04:51:14 crc kubenswrapper[4775]: I0321 04:51:14.987008 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" containerID="cri-o://cadd789c9a6b6b6dccb1af86d4359d6025e42837584cd6624c1767f1cecac34c" gracePeriod=30
Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.024357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.024747 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.524730353 +0000 UTC m=+228.501193977 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.128864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.129240 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.629226224 +0000 UTC m=+228.605689848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.173161 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbtgv"]
Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.181512 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mthhs"
Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.230045 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.730010439 +0000 UTC m=+228.706474063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.230149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.230777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.231173 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.73110591 +0000 UTC m=+228.707569534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.248128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.331704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.332778 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.832763739 +0000 UTC m=+228.809227363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.335136 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34228: no serving certificate available for the kubelet" Mar 21 04:51:15 crc kubenswrapper[4775]: W0321 04:51:15.383226 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bb65dee_cd5f_46b3_9e7d_36e5d182d19e.slice/crio-68f05861e537811c2b45eab073d2762025cd79d47c769cec2072bbee0ee4a322 WatchSource:0}: Error finding container 68f05861e537811c2b45eab073d2762025cd79d47c769cec2072bbee0ee4a322: Status 404 returned error can't find the container with id 68f05861e537811c2b45eab073d2762025cd79d47c769cec2072bbee0ee4a322 Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.398348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"] Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.433463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.433915 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:15.933895874 +0000 UTC m=+228.910359508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.530461 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-svf9w"] Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.536775 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.537206 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:16.03719001 +0000 UTC m=+229.013653634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.563282 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.564099 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.566358 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.566625 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.584964 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.638464 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.638730 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:51:16.138719666 +0000 UTC m=+229.115183290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.739110 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.739342 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:51:16.239290125 +0000 UTC m=+229.215753749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.739507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.739588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.739696 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: E0321 04:51:15.739907 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:51:16.239894712 +0000 UTC m=+229.216358336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hvd8t" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.789978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerStarted","Data":"ade6fa22510b087e9954c736cf8d4e9716ee7ec2fc194bbd254a73a9267ce630"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.791406 4775 generic.go:334] "Generic (PLEG): container finished" podID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerID="2f124547d6c6b6e4cadb63b6c1732b81d7643f21b62f22762426569eabbfc7c4" exitCode=0 Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.791472 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" event={"ID":"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb","Type":"ContainerDied","Data":"2f124547d6c6b6e4cadb63b6c1732b81d7643f21b62f22762426569eabbfc7c4"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.792357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerStarted","Data":"e49106633d57c504f2a2b545d76a34559e4299f9b5c7c912c88ab2862f4e055a"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.793147 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" 
event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerStarted","Data":"c9000a4269c650286aea4fb24ec488fdb1ea75dd0febdc2786cb6046e9da1f69"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.794770 4775 generic.go:334] "Generic (PLEG): container finished" podID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerID="cadd789c9a6b6b6dccb1af86d4359d6025e42837584cd6624c1767f1cecac34c" exitCode=0 Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.794813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" event={"ID":"ed9baf4c-70b2-450f-9b21-76dfafbc44d0","Type":"ContainerDied","Data":"cadd789c9a6b6b6dccb1af86d4359d6025e42837584cd6624c1767f1cecac34c"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.795604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerStarted","Data":"68f05861e537811c2b45eab073d2762025cd79d47c769cec2072bbee0ee4a322"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.796603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0f1cf896-5a0b-43e5-b36b-ba705e02825f","Type":"ContainerStarted","Data":"9a39bc499dfd59cb84df2219ab5f515cedb58c236d866f34ae7e0567a7995091"} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.813920 4775 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-21T04:51:14.940509444Z","Handler":null,"Name":""} Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.826646 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:15 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:15 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:15 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.826714 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.840240 4775 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.840281 4775 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.841559 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.841992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.842104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.842255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.864366 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.874504 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.889811 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.943230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.973214 4775 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 21 04:51:15 crc kubenswrapper[4775]: I0321 04:51:15.973271 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.040321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hvd8t\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") " pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.174393 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:51:16 crc 
kubenswrapper[4775]: W0321 04:51:16.228102 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod829c993f_4963_416a_8d66_15cf19a68237.slice/crio-8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920 WatchSource:0}: Error finding container 8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920: Status 404 returned error can't find the container with id 8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.230786 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.289674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.296964 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.299284 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.347991 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9fk\" (UniqueName: \"kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk\") pod \"a5aa6958-e573-4efb-a031-218c62b0bec9\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.348084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume\") pod \"a5aa6958-e573-4efb-a031-218c62b0bec9\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.348196 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume\") pod \"a5aa6958-e573-4efb-a031-218c62b0bec9\" (UID: \"a5aa6958-e573-4efb-a031-218c62b0bec9\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.350996 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume" (OuterVolumeSpecName: "config-volume") pod "a5aa6958-e573-4efb-a031-218c62b0bec9" (UID: "a5aa6958-e573-4efb-a031-218c62b0bec9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.354378 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5aa6958-e573-4efb-a031-218c62b0bec9" (UID: "a5aa6958-e573-4efb-a031-218c62b0bec9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.357514 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk" (OuterVolumeSpecName: "kube-api-access-wm9fk") pod "a5aa6958-e573-4efb-a031-218c62b0bec9" (UID: "a5aa6958-e573-4efb-a031-218c62b0bec9"). InnerVolumeSpecName "kube-api-access-wm9fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.394893 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:51:16 crc kubenswrapper[4775]: E0321 04:51:16.395202 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5aa6958-e573-4efb-a031-218c62b0bec9" containerName="collect-profiles" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395216 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5aa6958-e573-4efb-a031-218c62b0bec9" containerName="collect-profiles" Mar 21 04:51:16 crc kubenswrapper[4775]: E0321 04:51:16.395232 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395238 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: E0321 04:51:16.395248 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395255 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395347 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a5aa6958-e573-4efb-a031-218c62b0bec9" containerName="collect-profiles" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395365 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" containerName="controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.395376 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" containerName="route-controller-manager" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.396035 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.397856 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.426014 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca\") pod \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449304 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zb5h\" (UniqueName: \"kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h\") pod \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449347 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7h98\" (UniqueName: 
\"kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98\") pod \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles\") pod \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config\") pod \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449601 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config\") pod \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449630 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert\") pod \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\" (UID: \"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca\") pod \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449682 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert\") pod \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\" (UID: \"ed9baf4c-70b2-450f-9b21-76dfafbc44d0\") " Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449915 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5aa6958-e573-4efb-a031-218c62b0bec9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449927 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5aa6958-e573-4efb-a031-218c62b0bec9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.449937 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9fk\" (UniqueName: \"kubernetes.io/projected/a5aa6958-e573-4efb-a031-218c62b0bec9-kube-api-access-wm9fk\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.450529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ed9baf4c-70b2-450f-9b21-76dfafbc44d0" (UID: "ed9baf4c-70b2-450f-9b21-76dfafbc44d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.450668 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config" (OuterVolumeSpecName: "config") pod "ed9baf4c-70b2-450f-9b21-76dfafbc44d0" (UID: "ed9baf4c-70b2-450f-9b21-76dfafbc44d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.451586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca" (OuterVolumeSpecName: "client-ca") pod "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" (UID: "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.452778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config" (OuterVolumeSpecName: "config") pod "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" (UID: "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.452847 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" (UID: "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.454291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ed9baf4c-70b2-450f-9b21-76dfafbc44d0" (UID: "ed9baf4c-70b2-450f-9b21-76dfafbc44d0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.454835 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98" (OuterVolumeSpecName: "kube-api-access-r7h98") pod "ed9baf4c-70b2-450f-9b21-76dfafbc44d0" (UID: "ed9baf4c-70b2-450f-9b21-76dfafbc44d0"). InnerVolumeSpecName "kube-api-access-r7h98". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.454920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" (UID: "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.456202 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h" (OuterVolumeSpecName: "kube-api-access-2zb5h") pod "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" (UID: "f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb"). InnerVolumeSpecName "kube-api-access-2zb5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.541862 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552005 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gdv6\" (UniqueName: \"kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552376 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552740 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552766 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-client-ca\") on node \"crc\" DevicePath \"\"" 
Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552777 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552788 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552801 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zb5h\" (UniqueName: \"kubernetes.io/projected/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-kube-api-access-2zb5h\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552816 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7h98\" (UniqueName: \"kubernetes.io/projected/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-kube-api-access-r7h98\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552828 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552841 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9baf4c-70b2-450f-9b21-76dfafbc44d0-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.552946 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:16 crc kubenswrapper[4775]: W0321 04:51:16.573558 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60cd23b8_6e1c_492e_aeb5_8b16609d06d1.slice/crio-224b2bc46f2dabb71d4e0596c8a7cb1f7d3a4099e704fb5ab5882b52edcb5f8a WatchSource:0}: Error finding container 224b2bc46f2dabb71d4e0596c8a7cb1f7d3a4099e704fb5ab5882b52edcb5f8a: Status 404 returned error can't find the container with id 224b2bc46f2dabb71d4e0596c8a7cb1f7d3a4099e704fb5ab5882b52edcb5f8a Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.654487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gdv6\" (UniqueName: \"kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.655872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.656388 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.656852 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc 
kubenswrapper[4775]: I0321 04:51:16.658319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.676381 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gdv6\" (UniqueName: \"kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6\") pod \"redhat-marketplace-hldm7\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.720604 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.794691 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.796474 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.805842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" event={"ID":"60cd23b8-6e1c-492e-aeb5-8b16609d06d1","Type":"ContainerStarted","Data":"126ca17ac1331046f38ef3e95c39b37b4252bef89c1a0511f475a2f5ad46576e"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.805892 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" event={"ID":"60cd23b8-6e1c-492e-aeb5-8b16609d06d1","Type":"ContainerStarted","Data":"224b2bc46f2dabb71d4e0596c8a7cb1f7d3a4099e704fb5ab5882b52edcb5f8a"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.806336 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.808650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" event={"ID":"9de83f0b-7dd2-4846-a1ce-c8af930778f4","Type":"ContainerStarted","Data":"82d98e149ba56e4e52ec45de064c3a5f0fa406e38d48f08786a198d4c528ab79"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.810663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" event={"ID":"ed9baf4c-70b2-450f-9b21-76dfafbc44d0","Type":"ContainerDied","Data":"1dc3c475bbed8c4530a54779f813884a49b5a06477a18a344d8eb396adfd545f"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.810701 4775 scope.go:117] "RemoveContainer" containerID="cadd789c9a6b6b6dccb1af86d4359d6025e42837584cd6624c1767f1cecac34c" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.810786 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.817501 4775 generic.go:334] "Generic (PLEG): container finished" podID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerID="c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca" exitCode=0 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.817593 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerDied","Data":"c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.827099 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:16 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:16 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:16 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.827167 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.857717 4775 generic.go:334] "Generic (PLEG): container finished" podID="0f1cf896-5a0b-43e5-b36b-ba705e02825f" containerID="4b733bcb1d8da38520110c5a997c872fa32572b803088ac2576ee01c4664fef2" exitCode=0 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.857835 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"0f1cf896-5a0b-43e5-b36b-ba705e02825f","Type":"ContainerDied","Data":"4b733bcb1d8da38520110c5a997c872fa32572b803088ac2576ee01c4664fef2"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.862215 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" event={"ID":"f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb","Type":"ContainerDied","Data":"3aa7cdf6ad840aefd2de6cc5405b71b516de12c087774e2fad897251db2fc6ea"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.862257 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r6c6z" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.864464 4775 generic.go:334] "Generic (PLEG): container finished" podID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerID="00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc" exitCode=0 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.864526 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerDied","Data":"00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.870487 4775 scope.go:117] "RemoveContainer" containerID="2f124547d6c6b6e4cadb63b6c1732b81d7643f21b62f22762426569eabbfc7c4" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.871728 4775 generic.go:334] "Generic (PLEG): container finished" podID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerID="9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426" exitCode=0 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.871786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" 
event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerDied","Data":"9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.874684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"829c993f-4963-416a-8d66-15cf19a68237","Type":"ContainerStarted","Data":"7a7b97953f09a11e97eecb3805a985f1f57d9595e025c3794916c5e3633729ac"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.874726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"829c993f-4963-416a-8d66-15cf19a68237","Type":"ContainerStarted","Data":"8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.878942 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.893822 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" event={"ID":"a5aa6958-e573-4efb-a031-218c62b0bec9","Type":"ContainerDied","Data":"4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.893864 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3a49e3ddeff3a00ab9609e9ae3b8c22b4ec782261392142e9e4f564705d42d" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.893938 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.903406 4775 generic.go:334] "Generic (PLEG): container finished" podID="de54a48a-b733-4042-80b6-ecc719712314" containerID="c42abfff124b04fcc4cb97eb8de832fa3a68e69db6ccb668907d78050544eb6c" exitCode=0 Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.903449 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerDied","Data":"c42abfff124b04fcc4cb97eb8de832fa3a68e69db6ccb668907d78050544eb6c"} Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.942478 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" podStartSLOduration=170.942457028 podStartE2EDuration="2m50.942457028s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:16.932263165 +0000 UTC m=+229.908726789" watchObservedRunningTime="2026-03-21 04:51:16.942457028 +0000 UTC m=+229.918920662" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.944664 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hvfsp" podStartSLOduration=16.944640551 podStartE2EDuration="16.944640551s" podCreationTimestamp="2026-03-21 04:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:16.880144289 +0000 UTC m=+229.856607933" watchObservedRunningTime="2026-03-21 04:51:16.944640551 +0000 UTC m=+229.921104175" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.963295 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.966293 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.966329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm7zf\" (UniqueName: \"kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.966368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.971289 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r6c6z"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.975688 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"] Mar 21 04:51:16 crc kubenswrapper[4775]: I0321 04:51:16.979270 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-55lcp"] Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.060957 4775 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:51:17 crc kubenswrapper[4775]: W0321 04:51:17.061949 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod571e84f2_a2bc_4f09_ac53_d4a4adafa80b.slice/crio-615ed550bf3a5c04a2f6727d10336a359668c9d0754381b6a62de7c552e1b1b1 WatchSource:0}: Error finding container 615ed550bf3a5c04a2f6727d10336a359668c9d0754381b6a62de7c552e1b1b1: Status 404 returned error can't find the container with id 615ed550bf3a5c04a2f6727d10336a359668c9d0754381b6a62de7c552e1b1b1 Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.063750 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.063735851 podStartE2EDuration="2.063735851s" podCreationTimestamp="2026-03-21 04:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:17.0563957 +0000 UTC m=+230.032859334" watchObservedRunningTime="2026-03-21 04:51:17.063735851 +0000 UTC m=+230.040199475" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.067938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.067982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm7zf\" (UniqueName: \"kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " 
pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.068031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.068739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.068762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.099441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm7zf\" (UniqueName: \"kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf\") pod \"redhat-marketplace-7k9df\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.164243 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.191368 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.192489 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.194665 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.213778 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.271052 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.271220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.271356 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpppf\" (UniqueName: \"kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " 
pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.372828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.373328 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.373400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpppf\" (UniqueName: \"kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.373715 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.374398 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:51:17 crc 
kubenswrapper[4775]: I0321 04:51:17.395018 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"]
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.396046 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.399781 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpppf\" (UniqueName: \"kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf\") pod \"redhat-operators-wjfwd\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " pod="openshift-marketplace/redhat-operators-wjfwd"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.402705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"]
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.429342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"]
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.474567 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.474674 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdggw\" (UniqueName: \"kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.474759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.539551 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjfwd"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.576173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.576283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdggw\" (UniqueName: \"kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.576336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.577091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.577287 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.593404 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdggw\" (UniqueName: \"kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw\") pod \"redhat-operators-cfvcz\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.669320 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.670314 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed9baf4c-70b2-450f-9b21-76dfafbc44d0" path="/var/lib/kubelet/pods/ed9baf4c-70b2-450f-9b21-76dfafbc44d0/volumes"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.670967 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb" path="/var/lib/kubelet/pods/f56fd43d-f6e7-4e8c-ab18-3d5599eefcbb/volumes"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.719397 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfvcz"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.819102 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:17 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:17 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:17 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.819161 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.913605 4775 generic.go:334] "Generic (PLEG): container finished" podID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerID="ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be" exitCode=0
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.913674 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerDied","Data":"ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be"}
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.913743 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerStarted","Data":"615ed550bf3a5c04a2f6727d10336a359668c9d0754381b6a62de7c552e1b1b1"}
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.916769 4775 generic.go:334] "Generic (PLEG): container finished" podID="829c993f-4963-416a-8d66-15cf19a68237" containerID="7a7b97953f09a11e97eecb3805a985f1f57d9595e025c3794916c5e3633729ac" exitCode=0
Mar 21 04:51:17 crc kubenswrapper[4775]: I0321 04:51:17.916826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"829c993f-4963-416a-8d66-15cf19a68237","Type":"ContainerDied","Data":"7a7b97953f09a11e97eecb3805a985f1f57d9595e025c3794916c5e3633729ac"}
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.295898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.301920 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rz6g5"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.560298 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zs5pr"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.824950 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:18 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:18 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:18 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.825039 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.870099 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"]
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.871382 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.872216 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"]
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.873016 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.877725 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.877884 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878025 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878044 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878460 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878506 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878610 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.878648 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.879455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.879884 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.880190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.880319 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.886666 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.886935 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"]
Mar 21 04:51:18 crc kubenswrapper[4775]: I0321 04:51:18.891927 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"]
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcgzs\" (UniqueName: \"kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004290 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004379 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.004654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.005737 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.005794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.005944 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8fr\" (UniqueName: \"kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107735 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107794 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr8fr\" (UniqueName: \"kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcgzs\" (UniqueName: \"kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107909 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107930 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.107968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.108000 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.109765 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.110075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.111475 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.111545 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.116220 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.118162 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.128431 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcgzs\" (UniqueName: \"kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.132921 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr8fr\" (UniqueName: \"kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr\") pod \"controller-manager-b67498df7-tct4k\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.137025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert\") pod \"route-controller-manager-84c957496c-2zjbx\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") " pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.216482 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.226092 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.819197 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:19 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:19 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:19 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.819266 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:19 crc kubenswrapper[4775]: I0321 04:51:19.922662 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34234: no serving certificate available for the kubelet"
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.477166 4775 ???:1] "http: TLS handshake error from 192.168.126.11:34248: no serving certificate available for the kubelet"
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.818386 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:20 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:20 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:20 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.818471 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.913623 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.939402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"829c993f-4963-416a-8d66-15cf19a68237","Type":"ContainerDied","Data":"8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920"}
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.939450 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ee6fc7e5443705634d1b844abee45703b8c3f989c11bb3335bb9bbcf4253920"
Mar 21 04:51:20 crc kubenswrapper[4775]: I0321 04:51:20.939517 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.039973 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir\") pod \"829c993f-4963-416a-8d66-15cf19a68237\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") "
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.040052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access\") pod \"829c993f-4963-416a-8d66-15cf19a68237\" (UID: \"829c993f-4963-416a-8d66-15cf19a68237\") "
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.040229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "829c993f-4963-416a-8d66-15cf19a68237" (UID: "829c993f-4963-416a-8d66-15cf19a68237"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.040684 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/829c993f-4963-416a-8d66-15cf19a68237-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.045293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "829c993f-4963-416a-8d66-15cf19a68237" (UID: "829c993f-4963-416a-8d66-15cf19a68237"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.142234 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/829c993f-4963-416a-8d66-15cf19a68237-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.819734 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:21 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:21 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:21 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:21 crc kubenswrapper[4775]: I0321 04:51:21.819817 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:22 crc kubenswrapper[4775]: W0321 04:51:22.276276 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e2733e_3620_4ecb_a51e_33b1fd3dce9d.slice/crio-5477663a3a0fc934d7191139776ed4c58017ab46346a0a37602d99108fd8b86f WatchSource:0}: Error finding container 5477663a3a0fc934d7191139776ed4c58017ab46346a0a37602d99108fd8b86f: Status 404 returned error can't find the container with id 5477663a3a0fc934d7191139776ed4c58017ab46346a0a37602d99108fd8b86f
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.320998 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.481263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access\") pod \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") "
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.481318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir\") pod \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\" (UID: \"0f1cf896-5a0b-43e5-b36b-ba705e02825f\") "
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.481798 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0f1cf896-5a0b-43e5-b36b-ba705e02825f" (UID: "0f1cf896-5a0b-43e5-b36b-ba705e02825f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.485588 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0f1cf896-5a0b-43e5-b36b-ba705e02825f" (UID: "0f1cf896-5a0b-43e5-b36b-ba705e02825f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.583523 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.583868 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f1cf896-5a0b-43e5-b36b-ba705e02825f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.820025 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:22 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:22 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:22 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.820093 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.950926 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerStarted","Data":"5477663a3a0fc934d7191139776ed4c58017ab46346a0a37602d99108fd8b86f"}
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.963066 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0f1cf896-5a0b-43e5-b36b-ba705e02825f","Type":"ContainerDied","Data":"9a39bc499dfd59cb84df2219ab5f515cedb58c236d866f34ae7e0567a7995091"}
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.963103 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a39bc499dfd59cb84df2219ab5f515cedb58c236d866f34ae7e0567a7995091"
Mar 21 04:51:22 crc kubenswrapper[4775]: I0321 04:51:22.963171 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.024349 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.024404 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.024451 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-nstnr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.024496 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nstnr" podUID="c3d58eba-4ddf-463c-baa1-1943fb60c732" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.259914 4775 patch_prober.go:28] interesting pod/console-f9d7485db-wst2s container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.259965 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wst2s" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.437512 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4"
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.818769 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:51:23 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Mar 21 04:51:23 crc kubenswrapper[4775]: [+]process-running ok
Mar 21 04:51:23 crc kubenswrapper[4775]: healthz check failed
Mar 21 04:51:23 crc kubenswrapper[4775]: I0321 04:51:23.818827 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc"
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.004718 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.023394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6920413a-2c51-466d-a16e-d14489ae0c6c-metrics-certs\") pod \"network-metrics-daemon-xk9f5\" (UID: \"6920413a-2c51-466d-a16e-d14489ae0c6c\") " pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.292305 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk9f5" Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.359209 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.374967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"] Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.488312 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"] Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.496908 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"] Mar 21 04:51:24 crc kubenswrapper[4775]: W0321 04:51:24.532670 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cde2211_4b17_48c3_a414_d931756a8aed.slice/crio-26e33d93d40acbae83163a34763d11a2400dc57b95504cce6365dbca75d924d3 WatchSource:0}: Error finding container 26e33d93d40acbae83163a34763d11a2400dc57b95504cce6365dbca75d924d3: Status 404 returned error can't find the container with id 26e33d93d40acbae83163a34763d11a2400dc57b95504cce6365dbca75d924d3 Mar 21 04:51:24 crc kubenswrapper[4775]: W0321 04:51:24.543112 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff31a00_04ec_4558_963b_58851e3f2a4b.slice/crio-3bb4a60ce62110a0343ce33bf20b360542634587af7c8fc08808e3e39e831dc8 WatchSource:0}: Error finding container 3bb4a60ce62110a0343ce33bf20b360542634587af7c8fc08808e3e39e831dc8: Status 404 returned error can't find the container with id 3bb4a60ce62110a0343ce33bf20b360542634587af7c8fc08808e3e39e831dc8 Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.557415 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xk9f5"] Mar 21 04:51:24 crc kubenswrapper[4775]: W0321 04:51:24.594893 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6920413a_2c51_466d_a16e_d14489ae0c6c.slice/crio-fa19292a4f7eb2825da51194d18d10fd9a53360b9a311b276873cb1f81047c61 WatchSource:0}: Error finding container fa19292a4f7eb2825da51194d18d10fd9a53360b9a311b276873cb1f81047c61: Status 404 returned error can't find the container with id fa19292a4f7eb2825da51194d18d10fd9a53360b9a311b276873cb1f81047c61 Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.822349 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:24 crc 
kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:24 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:24 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.822876 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.984959 4775 generic.go:334] "Generic (PLEG): container finished" podID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerID="2fca8a116d2bdda5fb8662833d28064b756cf76d00f4ea921c85b01d826cd2be" exitCode=0 Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.985036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerDied","Data":"2fca8a116d2bdda5fb8662833d28064b756cf76d00f4ea921c85b01d826cd2be"} Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.988236 4775 generic.go:334] "Generic (PLEG): container finished" podID="96069bfd-088a-4053-ab37-76a04683a6a6" containerID="e2fe2d90884acf37e8aab163e4c967e9d771b2b2c054a404750ee68dc72d022c" exitCode=0 Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.988299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerDied","Data":"e2fe2d90884acf37e8aab163e4c967e9d771b2b2c054a404750ee68dc72d022c"} Mar 21 04:51:24 crc kubenswrapper[4775]: I0321 04:51:24.988325 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerStarted","Data":"778fa4db06d864ab1638a1a9d9660eefb0a2a7f44eb1587a31bef24f86e53ae8"} Mar 21 04:51:25 crc 
kubenswrapper[4775]: I0321 04:51:25.014828 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" event={"ID":"6920413a-2c51-466d-a16e-d14489ae0c6c","Type":"ContainerStarted","Data":"fa19292a4f7eb2825da51194d18d10fd9a53360b9a311b276873cb1f81047c61"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.025942 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" event={"ID":"1cde2211-4b17-48c3-a414-d931756a8aed","Type":"ContainerStarted","Data":"c026a9d086ef440232842548092ab223e1b082ca25729b9aaf1abf4ff41201dd"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.025998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" event={"ID":"1cde2211-4b17-48c3-a414-d931756a8aed","Type":"ContainerStarted","Data":"26e33d93d40acbae83163a34763d11a2400dc57b95504cce6365dbca75d924d3"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.026236 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.032573 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-wb89g" event={"ID":"cd25c8a4-8047-4602-a95b-3308af65bd38","Type":"ContainerStarted","Data":"5c917ed3fbfe2c82bdf6a98ab031c5d0f484f2833ad0eb7cad0c0b48f3a9128d"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.035203 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.038841 4775 generic.go:334] "Generic (PLEG): container finished" podID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerID="3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b" exitCode=0 Mar 21 04:51:25 crc 
kubenswrapper[4775]: I0321 04:51:25.039003 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerDied","Data":"3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.039098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerStarted","Data":"f3ec71e4b8bce118032bd210cfae1bbf7c53e6c8a5aba3f4f045bc35e188b3c0"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.045668 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" podStartSLOduration=10.045648962 podStartE2EDuration="10.045648962s" podCreationTimestamp="2026-03-21 04:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:25.04455906 +0000 UTC m=+238.021022694" watchObservedRunningTime="2026-03-21 04:51:25.045648962 +0000 UTC m=+238.022112586" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.067069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" event={"ID":"cff31a00-04ec-4558-963b-58851e3f2a4b","Type":"ContainerStarted","Data":"adb4a7b613a2923551c3395853cd07c6da93648bbb3bbab0261ec4ce5d8387a6"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.067144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" event={"ID":"cff31a00-04ec-4558-963b-58851e3f2a4b","Type":"ContainerStarted","Data":"3bb4a60ce62110a0343ce33bf20b360542634587af7c8fc08808e3e39e831dc8"} Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.068345 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.147828 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567810-wb89g" podStartSLOduration=66.414388366 podStartE2EDuration="1m25.147808425s" podCreationTimestamp="2026-03-21 04:50:00 +0000 UTC" firstStartedPulling="2026-03-21 04:51:05.415107708 +0000 UTC m=+218.391571332" lastFinishedPulling="2026-03-21 04:51:24.148527767 +0000 UTC m=+237.124991391" observedRunningTime="2026-03-21 04:51:25.117833325 +0000 UTC m=+238.094296949" watchObservedRunningTime="2026-03-21 04:51:25.147808425 +0000 UTC m=+238.124272049" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.179728 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" podStartSLOduration=10.179707132 podStartE2EDuration="10.179707132s" podCreationTimestamp="2026-03-21 04:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:25.172639859 +0000 UTC m=+238.149103483" watchObservedRunningTime="2026-03-21 04:51:25.179707132 +0000 UTC m=+238.156170756" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.249042 4775 csr.go:261] certificate signing request csr-2dhqr is approved, waiting to be issued Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.252828 4775 csr.go:257] certificate signing request csr-2dhqr is issued Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.380441 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.820722 4775 patch_prober.go:28] interesting 
pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:25 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:25 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:25 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:25 crc kubenswrapper[4775]: I0321 04:51:25.821044 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.085457 4775 generic.go:334] "Generic (PLEG): container finished" podID="cd25c8a4-8047-4602-a95b-3308af65bd38" containerID="5c917ed3fbfe2c82bdf6a98ab031c5d0f484f2833ad0eb7cad0c0b48f3a9128d" exitCode=0 Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.085531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-wb89g" event={"ID":"cd25c8a4-8047-4602-a95b-3308af65bd38","Type":"ContainerDied","Data":"5c917ed3fbfe2c82bdf6a98ab031c5d0f484f2833ad0eb7cad0c0b48f3a9128d"} Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.091579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" event={"ID":"6920413a-2c51-466d-a16e-d14489ae0c6c","Type":"ContainerStarted","Data":"3718ea3bd9f7e58a94bd2fb8fd34403fdbb7b24fef3a69e3f327b3911687ac6d"} Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.254544 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 20:44:19.809368839 +0000 UTC Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.254586 4775 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 6567h52m53.554785339s for next certificate rotation Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.823343 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:26 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:26 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:26 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:26 crc kubenswrapper[4775]: I0321 04:51:26.823414 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.106742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk9f5" event={"ID":"6920413a-2c51-466d-a16e-d14489ae0c6c","Type":"ContainerStarted","Data":"f2d2a3e252ec6c2fff72b9621cb27e67c2d7171b52aeea09f4251be691845e7c"} Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.459231 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.477164 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xk9f5" podStartSLOduration=181.477143531 podStartE2EDuration="3m1.477143531s" podCreationTimestamp="2026-03-21 04:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:51:27.122132365 +0000 UTC m=+240.098595989" watchObservedRunningTime="2026-03-21 04:51:27.477143531 +0000 UTC m=+240.453607155" Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.592987 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm89b\" (UniqueName: \"kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b\") pod \"cd25c8a4-8047-4602-a95b-3308af65bd38\" (UID: \"cd25c8a4-8047-4602-a95b-3308af65bd38\") " Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.603771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b" (OuterVolumeSpecName: "kube-api-access-gm89b") pod "cd25c8a4-8047-4602-a95b-3308af65bd38" (UID: "cd25c8a4-8047-4602-a95b-3308af65bd38"). InnerVolumeSpecName "kube-api-access-gm89b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.695986 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm89b\" (UniqueName: \"kubernetes.io/projected/cd25c8a4-8047-4602-a95b-3308af65bd38-kube-api-access-gm89b\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.820244 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:27 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:27 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:27 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:27 crc kubenswrapper[4775]: I0321 04:51:27.820294 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:28 crc kubenswrapper[4775]: I0321 04:51:28.117909 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-wb89g" event={"ID":"cd25c8a4-8047-4602-a95b-3308af65bd38","Type":"ContainerDied","Data":"05b88dca4e9352028f202490cd0cd91a11f14d92891caca0835ae23e4321056a"} Mar 21 04:51:28 crc kubenswrapper[4775]: I0321 04:51:28.117976 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b88dca4e9352028f202490cd0cd91a11f14d92891caca0835ae23e4321056a" Mar 21 04:51:28 crc kubenswrapper[4775]: I0321 04:51:28.117932 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-wb89g" Mar 21 04:51:28 crc kubenswrapper[4775]: I0321 04:51:28.819602 4775 patch_prober.go:28] interesting pod/router-default-5444994796-2z46g container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:51:28 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Mar 21 04:51:28 crc kubenswrapper[4775]: [+]process-running ok Mar 21 04:51:28 crc kubenswrapper[4775]: healthz check failed Mar 21 04:51:28 crc kubenswrapper[4775]: I0321 04:51:28.819678 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2z46g" podUID="a79bbac6-f40a-4c92-8854-7ab5e72573cc" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:51:29 crc kubenswrapper[4775]: I0321 04:51:29.820260 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:29 crc kubenswrapper[4775]: I0321 04:51:29.823378 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2z46g" Mar 21 04:51:32 crc kubenswrapper[4775]: I0321 04:51:32.482964 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:51:32 crc kubenswrapper[4775]: I0321 04:51:32.483285 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 21 04:51:33 crc kubenswrapper[4775]: I0321 04:51:33.038179 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nstnr" Mar 21 04:51:33 crc kubenswrapper[4775]: I0321 04:51:33.340185 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:33 crc kubenswrapper[4775]: I0321 04:51:33.344160 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 04:51:34 crc kubenswrapper[4775]: I0321 04:51:34.409775 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"] Mar 21 04:51:34 crc kubenswrapper[4775]: I0321 04:51:34.409987 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager" containerID="cri-o://c026a9d086ef440232842548092ab223e1b082ca25729b9aaf1abf4ff41201dd" gracePeriod=30 Mar 21 04:51:34 crc kubenswrapper[4775]: I0321 04:51:34.419092 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"] Mar 21 04:51:34 crc kubenswrapper[4775]: I0321 04:51:34.419369 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager" containerID="cri-o://adb4a7b613a2923551c3395853cd07c6da93648bbb3bbab0261ec4ce5d8387a6" gracePeriod=30 Mar 21 04:51:35 crc kubenswrapper[4775]: I0321 04:51:35.220320 4775 generic.go:334] "Generic (PLEG): container finished" podID="1cde2211-4b17-48c3-a414-d931756a8aed" 
containerID="c026a9d086ef440232842548092ab223e1b082ca25729b9aaf1abf4ff41201dd" exitCode=0 Mar 21 04:51:35 crc kubenswrapper[4775]: I0321 04:51:35.220412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" event={"ID":"1cde2211-4b17-48c3-a414-d931756a8aed","Type":"ContainerDied","Data":"c026a9d086ef440232842548092ab223e1b082ca25729b9aaf1abf4ff41201dd"} Mar 21 04:51:35 crc kubenswrapper[4775]: I0321 04:51:35.222006 4775 generic.go:334] "Generic (PLEG): container finished" podID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerID="adb4a7b613a2923551c3395853cd07c6da93648bbb3bbab0261ec4ce5d8387a6" exitCode=0 Mar 21 04:51:35 crc kubenswrapper[4775]: I0321 04:51:35.222050 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" event={"ID":"cff31a00-04ec-4558-963b-58851e3f2a4b","Type":"ContainerDied","Data":"adb4a7b613a2923551c3395853cd07c6da93648bbb3bbab0261ec4ce5d8387a6"} Mar 21 04:51:36 crc kubenswrapper[4775]: I0321 04:51:36.296658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" Mar 21 04:51:39 crc kubenswrapper[4775]: I0321 04:51:39.217031 4775 patch_prober.go:28] interesting pod/controller-manager-b67498df7-tct4k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 21 04:51:39 crc kubenswrapper[4775]: I0321 04:51:39.217140 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 21 
04:51:39 crc kubenswrapper[4775]: I0321 04:51:39.226990 4775 patch_prober.go:28] interesting pod/route-controller-manager-84c957496c-2zjbx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Mar 21 04:51:39 crc kubenswrapper[4775]: I0321 04:51:39.227055 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Mar 21 04:51:40 crc kubenswrapper[4775]: I0321 04:51:40.685520 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:51:43 crc kubenswrapper[4775]: I0321 04:51:43.708857 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mbgvc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.217361 4775 patch_prober.go:28] interesting pod/controller-manager-b67498df7-tct4k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body=
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.219437 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.227244 4775 patch_prober.go:28] interesting pod/route-controller-manager-84c957496c-2zjbx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.227306 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.370150 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:51:49 crc kubenswrapper[4775]: E0321 04:51:49.370663 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd25c8a4-8047-4602-a95b-3308af65bd38" containerName="oc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.370701 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd25c8a4-8047-4602-a95b-3308af65bd38" containerName="oc"
Mar 21 04:51:49 crc kubenswrapper[4775]: E0321 04:51:49.370725 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1cf896-5a0b-43e5-b36b-ba705e02825f" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.370737 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1cf896-5a0b-43e5-b36b-ba705e02825f" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: E0321 04:51:49.370761 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829c993f-4963-416a-8d66-15cf19a68237" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.370774 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="829c993f-4963-416a-8d66-15cf19a68237" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.373304 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1cf896-5a0b-43e5-b36b-ba705e02825f" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.373403 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="829c993f-4963-416a-8d66-15cf19a68237" containerName="pruner"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.373424 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd25c8a4-8047-4602-a95b-3308af65bd38" containerName="oc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.383150 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.383437 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.386494 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.386880 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.521255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.521510 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.622770 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.622870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.622943 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.647151 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:49 crc kubenswrapper[4775]: I0321 04:51:49.709667 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.758563 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.759839 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.772265 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.888371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.888813 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.889073 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.990072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.990420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.990528 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.990677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:54 crc kubenswrapper[4775]: I0321 04:51:54.990790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:55 crc kubenswrapper[4775]: I0321 04:51:55.009924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:55 crc kubenswrapper[4775]: I0321 04:51:55.083447 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:51:55 crc kubenswrapper[4775]: E0321 04:51:55.739817 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 21 04:51:55 crc kubenswrapper[4775]: E0321 04:51:55.740398 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98dgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v5fgf_openshift-marketplace(de54a48a-b733-4042-80b6-ecc719712314): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:51:55 crc kubenswrapper[4775]: E0321 04:51:55.741562 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v5fgf" podUID="de54a48a-b733-4042-80b6-ecc719712314"
Mar 21 04:51:59 crc kubenswrapper[4775]: E0321 04:51:59.822135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v5fgf" podUID="de54a48a-b733-4042-80b6-ecc719712314"
Mar 21 04:51:59 crc kubenswrapper[4775]: E0321 04:51:59.912005 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 21 04:51:59 crc kubenswrapper[4775]: E0321 04:51:59.912192 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdggw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cfvcz_openshift-marketplace(96069bfd-088a-4053-ab37-76a04683a6a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:51:59 crc kubenswrapper[4775]: E0321 04:51:59.913392 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cfvcz" podUID="96069bfd-088a-4053-ab37-76a04683a6a6"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.133109 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567812-6wppn"]
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.135525 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-6wppn"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.137447 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.138239 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.138408 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.139754 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-6wppn"]
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.153850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dh2d\" (UniqueName: \"kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d\") pod \"auto-csr-approver-29567812-6wppn\" (UID: \"d8101654-10fd-404b-ae0f-a098719418f4\") " pod="openshift-infra/auto-csr-approver-29567812-6wppn"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.217716 4775 patch_prober.go:28] interesting pod/controller-manager-b67498df7-tct4k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.217784 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.227168 4775 patch_prober.go:28] interesting pod/route-controller-manager-84c957496c-2zjbx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.227213 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.255601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dh2d\" (UniqueName: \"kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d\") pod \"auto-csr-approver-29567812-6wppn\" (UID: \"d8101654-10fd-404b-ae0f-a098719418f4\") " pod="openshift-infra/auto-csr-approver-29567812-6wppn"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.279396 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dh2d\" (UniqueName: \"kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d\") pod \"auto-csr-approver-29567812-6wppn\" (UID: \"d8101654-10fd-404b-ae0f-a098719418f4\") " pod="openshift-infra/auto-csr-approver-29567812-6wppn"
Mar 21 04:52:00 crc kubenswrapper[4775]: I0321 04:52:00.458197 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-6wppn"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.485051 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cfvcz" podUID="96069bfd-088a-4053-ab37-76a04683a6a6"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.513400 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.513587 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpppf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-wjfwd_openshift-marketplace(70ad413a-5f81-4094-b2d8-9b89698c6e32): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.514840 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-wjfwd" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.566378 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.566682 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92fzw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wbtgv_openshift-marketplace(ccb910ab-dcef-4523-81df-c0fb5eb83429): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.568413 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wbtgv" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.579462 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.579646 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ww6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-959kj_openshift-marketplace(5bb65dee-cd5f-46b3-9e7d-36e5d182d19e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:01 crc kubenswrapper[4775]: E0321 04:52:01.580828 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-959kj" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e"
Mar 21 04:52:02 crc kubenswrapper[4775]: I0321 04:52:02.482040 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:52:02 crc kubenswrapper[4775]: I0321 04:52:02.482455 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:52:02 crc kubenswrapper[4775]: I0321 04:52:02.482513 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn"
Mar 21 04:52:02 crc kubenswrapper[4775]: I0321 04:52:02.483167 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 04:52:02 crc kubenswrapper[4775]: I0321 04:52:02.483234 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee" gracePeriod=600
Mar 21 04:52:03 crc kubenswrapper[4775]: E0321 04:52:03.144504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-wjfwd" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32"
Mar 21 04:52:03 crc kubenswrapper[4775]: E0321 04:52:03.144504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wbtgv" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429"
Mar 21 04:52:03 crc kubenswrapper[4775]: E0321 04:52:03.144532 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-959kj" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.293841 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.302317 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca\") pod \"cff31a00-04ec-4558-963b-58851e3f2a4b\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") "
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.302389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert\") pod \"cff31a00-04ec-4558-963b-58851e3f2a4b\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") "
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.302429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcgzs\" (UniqueName: \"kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs\") pod \"cff31a00-04ec-4558-963b-58851e3f2a4b\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") "
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.302473 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config\") pod \"cff31a00-04ec-4558-963b-58851e3f2a4b\" (UID: \"cff31a00-04ec-4558-963b-58851e3f2a4b\") "
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.304027 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca" (OuterVolumeSpecName: "client-ca") pod "cff31a00-04ec-4558-963b-58851e3f2a4b" (UID: "cff31a00-04ec-4558-963b-58851e3f2a4b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.305779 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config" (OuterVolumeSpecName: "config") pod "cff31a00-04ec-4558-963b-58851e3f2a4b" (UID: "cff31a00-04ec-4558-963b-58851e3f2a4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.314900 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cff31a00-04ec-4558-963b-58851e3f2a4b" (UID: "cff31a00-04ec-4558-963b-58851e3f2a4b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.332637 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.334225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs" (OuterVolumeSpecName: "kube-api-access-rcgzs") pod "cff31a00-04ec-4558-963b-58851e3f2a4b" (UID: "cff31a00-04ec-4558-963b-58851e3f2a4b"). InnerVolumeSpecName "kube-api-access-rcgzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.336515 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"]
Mar 21 04:52:03 crc kubenswrapper[4775]: E0321 04:52:03.336834 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.336851 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: E0321 04:52:03.336866 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.336872 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.336980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" containerName="route-controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.336995 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" containerName="controller-manager"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.337682 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.348442 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.393928 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k" event={"ID":"1cde2211-4b17-48c3-a414-d931756a8aed","Type":"ContainerDied","Data":"26e33d93d40acbae83163a34763d11a2400dc57b95504cce6365dbca75d924d3"}
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.393976 4775 scope.go:117] "RemoveContainer" containerID="c026a9d086ef440232842548092ab223e1b082ca25729b9aaf1abf4ff41201dd"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.394072 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b67498df7-tct4k"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.399982 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee" exitCode=0
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.400048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee"}
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.402968 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert\") pod \"1cde2211-4b17-48c3-a414-d931756a8aed\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") "
Mar 21 04:52:03 crc
kubenswrapper[4775]: I0321 04:52:03.403248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles\") pod \"1cde2211-4b17-48c3-a414-d931756a8aed\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca\") pod \"1cde2211-4b17-48c3-a414-d931756a8aed\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403313 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr8fr\" (UniqueName: \"kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr\") pod \"1cde2211-4b17-48c3-a414-d931756a8aed\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403372 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config\") pod \"1cde2211-4b17-48c3-a414-d931756a8aed\" (UID: \"1cde2211-4b17-48c3-a414-d931756a8aed\") " Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403494 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403530 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9hc9l\" (UniqueName: \"kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403552 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403685 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403698 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cff31a00-04ec-4558-963b-58851e3f2a4b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403707 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcgzs\" (UniqueName: \"kubernetes.io/projected/cff31a00-04ec-4558-963b-58851e3f2a4b-kube-api-access-rcgzs\") on node \"crc\" DevicePath \"\"" 
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.403716 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cff31a00-04ec-4558-963b-58851e3f2a4b-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.404252 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca" (OuterVolumeSpecName: "client-ca") pod "1cde2211-4b17-48c3-a414-d931756a8aed" (UID: "1cde2211-4b17-48c3-a414-d931756a8aed"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.404500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1cde2211-4b17-48c3-a414-d931756a8aed" (UID: "1cde2211-4b17-48c3-a414-d931756a8aed"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.405232 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config" (OuterVolumeSpecName: "config") pod "1cde2211-4b17-48c3-a414-d931756a8aed" (UID: "1cde2211-4b17-48c3-a414-d931756a8aed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.405953 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx" event={"ID":"cff31a00-04ec-4558-963b-58851e3f2a4b","Type":"ContainerDied","Data":"3bb4a60ce62110a0343ce33bf20b360542634587af7c8fc08808e3e39e831dc8"}
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.406047 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.407599 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1cde2211-4b17-48c3-a414-d931756a8aed" (UID: "1cde2211-4b17-48c3-a414-d931756a8aed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.408903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr" (OuterVolumeSpecName: "kube-api-access-pr8fr") pod "1cde2211-4b17-48c3-a414-d931756a8aed" (UID: "1cde2211-4b17-48c3-a414-d931756a8aed"). InnerVolumeSpecName "kube-api-access-pr8fr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.425342 4775 scope.go:117] "RemoveContainer" containerID="adb4a7b613a2923551c3395853cd07c6da93648bbb3bbab0261ec4ce5d8387a6"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.435146 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.438542 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c957496c-2zjbx"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504630 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hc9l\" (UniqueName: \"kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504783 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504794 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1cde2211-4b17-48c3-a414-d931756a8aed-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504806 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504815 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1cde2211-4b17-48c3-a414-d931756a8aed-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.504826 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr8fr\" (UniqueName: \"kubernetes.io/projected/1cde2211-4b17-48c3-a414-d931756a8aed-kube-api-access-pr8fr\") on node \"crc\" DevicePath \"\""
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.506204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.506507 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.508588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.525853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hc9l\" (UniqueName: \"kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l\") pod \"route-controller-manager-7fbcc6866-hpvz8\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.645502 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.655901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:52:03 crc kubenswrapper[4775]: W0321 04:52:03.658711 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc0e122c_1775_4bf1_9025_6288c383b3f2.slice/crio-df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09 WatchSource:0}: Error finding container df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09: Status 404 returned error can't find the container with id df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.667568 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff31a00-04ec-4558-963b-58851e3f2a4b" path="/var/lib/kubelet/pods/cff31a00-04ec-4558-963b-58851e3f2a4b/volumes"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.670335 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.711207 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.714787 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b67498df7-tct4k"]
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.724906 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-6wppn"]
Mar 21 04:52:03 crc kubenswrapper[4775]: W0321 04:52:03.736943 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8101654_10fd_404b_ae0f_a098719418f4.slice/crio-b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9 WatchSource:0}: Error finding container b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9: Status 404 returned error can't find the container with id b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9
Mar 21 04:52:03 crc kubenswrapper[4775]: I0321 04:52:03.899152 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"]
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.373332 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.373761 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mm7zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7k9df_openshift-marketplace(55e2733e-3620-4ecb-a51e-33b1fd3dce9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.374652 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.374941 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2gdv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hldm7_openshift-marketplace(571e84f2-a2bc-4f09-ac53-d4a4adafa80b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.374953 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7k9df" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.377475 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hldm7" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b"
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.412840 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc0e122c-1775-4bf1-9025-6288c383b3f2","Type":"ContainerStarted","Data":"de0bb723a8e5a2f1ecac5f9d70f09000ecc4cbf0b2ce82838fe0f1932e47f097"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.412888 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc0e122c-1775-4bf1-9025-6288c383b3f2","Type":"ContainerStarted","Data":"df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.414921 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d6977ba5-d878-4457-a6d4-8acd42ebb089","Type":"ContainerStarted","Data":"49f03f37f3492994f82774da13ca12406fd7976bb8941707a7ec3e837116fcba"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.414957 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d6977ba5-d878-4457-a6d4-8acd42ebb089","Type":"ContainerStarted","Data":"bb2095279e346ad9805b3d902d26b2c1ec43b438031bbbe2a123459d83839e23"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.416628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-6wppn" event={"ID":"d8101654-10fd-404b-ae0f-a098719418f4","Type":"ContainerStarted","Data":"b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.418560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.420489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" event={"ID":"ef0e747e-48cd-4a7d-922d-905e15ea750e","Type":"ContainerStarted","Data":"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b"}
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.420530 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" event={"ID":"ef0e747e-48cd-4a7d-922d-905e15ea750e","Type":"ContainerStarted","Data":"f1c5e9ed26c65634fec4b7e068160478cde4bffe6efaa83fc8c27834b2f650f2"}
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.422470 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hldm7" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.422471 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7k9df" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d"
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.432978 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=10.432957254 podStartE2EDuration="10.432957254s" podCreationTimestamp="2026-03-21 04:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:04.430425181 +0000 UTC m=+277.406888825" watchObservedRunningTime="2026-03-21 04:52:04.432957254 +0000 UTC m=+277.409420888"
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.463134 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=15.463101735 podStartE2EDuration="15.463101735s" podCreationTimestamp="2026-03-21 04:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:04.461307743 +0000 UTC m=+277.437771377" watchObservedRunningTime="2026-03-21 04:52:04.463101735 +0000 UTC m=+277.439565359"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.480789 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.480924 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wq48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-svf9w_openshift-marketplace(1dcbca72-150a-47c6-ac3c-f701ae82e05b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:52:04 crc kubenswrapper[4775]: E0321 04:52:04.482273 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-svf9w" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b"
Mar 21 04:52:04 crc kubenswrapper[4775]: I0321 04:52:04.483444 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" podStartSLOduration=10.483431752 podStartE2EDuration="10.483431752s" podCreationTimestamp="2026-03-21 04:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:04.480318702 +0000 UTC m=+277.456782326" watchObservedRunningTime="2026-03-21 04:52:04.483431752 +0000 UTC m=+277.459895386"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.427992 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8101654-10fd-404b-ae0f-a098719418f4" containerID="4cd8fe98e0e79605364ea6850bbec4e960085c0c48458cb67fb164296fe16045" exitCode=0
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.428033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-6wppn" event={"ID":"d8101654-10fd-404b-ae0f-a098719418f4","Type":"ContainerDied","Data":"4cd8fe98e0e79605364ea6850bbec4e960085c0c48458cb67fb164296fe16045"}
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.429760 4775 generic.go:334] "Generic (PLEG): container finished" podID="d6977ba5-d878-4457-a6d4-8acd42ebb089" containerID="49f03f37f3492994f82774da13ca12406fd7976bb8941707a7ec3e837116fcba" exitCode=0
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.429890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d6977ba5-d878-4457-a6d4-8acd42ebb089","Type":"ContainerDied","Data":"49f03f37f3492994f82774da13ca12406fd7976bb8941707a7ec3e837116fcba"}
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.430078 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:05 crc kubenswrapper[4775]: E0321 04:52:05.432717 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-svf9w" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.437243 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.667864 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cde2211-4b17-48c3-a414-d931756a8aed" path="/var/lib/kubelet/pods/1cde2211-4b17-48c3-a414-d931756a8aed/volumes"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.946206 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"]
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.946874 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.948739 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.949026 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.952380 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.952616 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.953237 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.953494 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.957812 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:52:05 crc kubenswrapper[4775]: I0321 04:52:05.958882 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"]
Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.139306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6s67\" (UniqueName: \"kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") "
pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.139352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.139384 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.139412 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.139444 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.254793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6s67\" (UniqueName: 
\"kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.254875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.254947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.255022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.255168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.256837 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.256896 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.257213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.262649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.275061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6s67\" (UniqueName: \"kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67\") pod \"controller-manager-7bd4667b97-z2cn5\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 
04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.572446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.595483 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.741907 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.764048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir\") pod \"d6977ba5-d878-4457-a6d4-8acd42ebb089\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.764127 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access\") pod \"d6977ba5-d878-4457-a6d4-8acd42ebb089\" (UID: \"d6977ba5-d878-4457-a6d4-8acd42ebb089\") " Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.766348 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d6977ba5-d878-4457-a6d4-8acd42ebb089" (UID: "d6977ba5-d878-4457-a6d4-8acd42ebb089"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.776281 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d6977ba5-d878-4457-a6d4-8acd42ebb089" (UID: "d6977ba5-d878-4457-a6d4-8acd42ebb089"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.849013 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-6wppn" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.865822 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6977ba5-d878-4457-a6d4-8acd42ebb089-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.865857 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6977ba5-d878-4457-a6d4-8acd42ebb089-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.967184 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dh2d\" (UniqueName: \"kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d\") pod \"d8101654-10fd-404b-ae0f-a098719418f4\" (UID: \"d8101654-10fd-404b-ae0f-a098719418f4\") " Mar 21 04:52:06 crc kubenswrapper[4775]: I0321 04:52:06.970620 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d" (OuterVolumeSpecName: "kube-api-access-5dh2d") pod "d8101654-10fd-404b-ae0f-a098719418f4" (UID: "d8101654-10fd-404b-ae0f-a098719418f4"). 
InnerVolumeSpecName "kube-api-access-5dh2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.068489 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dh2d\" (UniqueName: \"kubernetes.io/projected/d8101654-10fd-404b-ae0f-a098719418f4-kube-api-access-5dh2d\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.162927 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"] Mar 21 04:52:07 crc kubenswrapper[4775]: W0321 04:52:07.173478 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05791985_4754_4e1e_9400_7204c6a5eab2.slice/crio-1b352205937c3361b5c47711682f5ba564f7c8d78431497ed52ef09e4be45784 WatchSource:0}: Error finding container 1b352205937c3361b5c47711682f5ba564f7c8d78431497ed52ef09e4be45784: Status 404 returned error can't find the container with id 1b352205937c3361b5c47711682f5ba564f7c8d78431497ed52ef09e4be45784 Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.441637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d6977ba5-d878-4457-a6d4-8acd42ebb089","Type":"ContainerDied","Data":"bb2095279e346ad9805b3d902d26b2c1ec43b438031bbbe2a123459d83839e23"} Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.441681 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2095279e346ad9805b3d902d26b2c1ec43b438031bbbe2a123459d83839e23" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.441742 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.446505 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-6wppn" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.446504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-6wppn" event={"ID":"d8101654-10fd-404b-ae0f-a098719418f4","Type":"ContainerDied","Data":"b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9"} Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.446639 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4836b0688f8984a0812fbf1a29d1435aec6aa3f8171f16c027c71eb201157a9" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.449359 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" event={"ID":"05791985-4754-4e1e-9400-7204c6a5eab2","Type":"ContainerStarted","Data":"5494d63ffe4e008a81f5adf939d712754549a2efc64457da115e33ba5aea120c"} Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.449401 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" event={"ID":"05791985-4754-4e1e-9400-7204c6a5eab2","Type":"ContainerStarted","Data":"1b352205937c3361b5c47711682f5ba564f7c8d78431497ed52ef09e4be45784"} Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.449743 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.468433 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:07 crc kubenswrapper[4775]: I0321 04:52:07.521656 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" podStartSLOduration=13.521634875 podStartE2EDuration="13.521634875s" 
podCreationTimestamp="2026-03-21 04:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:07.488276732 +0000 UTC m=+280.464740356" watchObservedRunningTime="2026-03-21 04:52:07.521634875 +0000 UTC m=+280.498098499" Mar 21 04:52:16 crc kubenswrapper[4775]: I0321 04:52:16.271474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerStarted","Data":"86d840f2a45ca8b3bb3a23817480b80390f3ea92771806965d547cb2a92d185e"} Mar 21 04:52:16 crc kubenswrapper[4775]: I0321 04:52:16.273157 4775 generic.go:334] "Generic (PLEG): container finished" podID="de54a48a-b733-4042-80b6-ecc719712314" containerID="661b6896c2a4aa43857ae9de8bb6472c19fadc3bd2c01a766c40cd5ed12ea550" exitCode=0 Mar 21 04:52:16 crc kubenswrapper[4775]: I0321 04:52:16.273187 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerDied","Data":"661b6896c2a4aa43857ae9de8bb6472c19fadc3bd2c01a766c40cd5ed12ea550"} Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.280972 4775 generic.go:334] "Generic (PLEG): container finished" podID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerID="8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc" exitCode=0 Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.281347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerDied","Data":"8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc"} Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.284803 4775 generic.go:334] "Generic (PLEG): container finished" podID="96069bfd-088a-4053-ab37-76a04683a6a6" 
containerID="86d840f2a45ca8b3bb3a23817480b80390f3ea92771806965d547cb2a92d185e" exitCode=0 Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.284868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerDied","Data":"86d840f2a45ca8b3bb3a23817480b80390f3ea92771806965d547cb2a92d185e"} Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.288787 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerStarted","Data":"e9e73c129b659ac9d1d7ebb1db8f0b40547803719861caf1d1794c7a799f2a15"} Mar 21 04:52:17 crc kubenswrapper[4775]: I0321 04:52:17.691368 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v5fgf" podStartSLOduration=3.971158257 podStartE2EDuration="1m3.691348032s" podCreationTimestamp="2026-03-21 04:51:14 +0000 UTC" firstStartedPulling="2026-03-21 04:51:16.905813036 +0000 UTC m=+229.882276670" lastFinishedPulling="2026-03-21 04:52:16.626002821 +0000 UTC m=+289.602466445" observedRunningTime="2026-03-21 04:52:17.337404812 +0000 UTC m=+290.313868436" watchObservedRunningTime="2026-03-21 04:52:17.691348032 +0000 UTC m=+290.667811656" Mar 21 04:52:18 crc kubenswrapper[4775]: I0321 04:52:18.303155 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerStarted","Data":"5448b11851da85cc3503470566b1ac0017e53d4c7a81182921503467a33e3be5"} Mar 21 04:52:18 crc kubenswrapper[4775]: I0321 04:52:18.308517 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" 
event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerStarted","Data":"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96"} Mar 21 04:52:18 crc kubenswrapper[4775]: I0321 04:52:18.321654 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cfvcz" podStartSLOduration=8.546578343 podStartE2EDuration="1m1.321636341s" podCreationTimestamp="2026-03-21 04:51:17 +0000 UTC" firstStartedPulling="2026-03-21 04:51:24.993965317 +0000 UTC m=+237.970428941" lastFinishedPulling="2026-03-21 04:52:17.769023315 +0000 UTC m=+290.745486939" observedRunningTime="2026-03-21 04:52:18.321402684 +0000 UTC m=+291.297866318" watchObservedRunningTime="2026-03-21 04:52:18.321636341 +0000 UTC m=+291.298099955" Mar 21 04:52:18 crc kubenswrapper[4775]: I0321 04:52:18.336423 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-959kj" podStartSLOduration=4.503403423 podStartE2EDuration="1m5.336406837s" podCreationTimestamp="2026-03-21 04:51:13 +0000 UTC" firstStartedPulling="2026-03-21 04:51:16.821184955 +0000 UTC m=+229.797648579" lastFinishedPulling="2026-03-21 04:52:17.654188369 +0000 UTC m=+290.630651993" observedRunningTime="2026-03-21 04:52:18.335774399 +0000 UTC m=+291.312238033" watchObservedRunningTime="2026-03-21 04:52:18.336406837 +0000 UTC m=+291.312870461" Mar 21 04:52:19 crc kubenswrapper[4775]: I0321 04:52:19.325500 4775 generic.go:334] "Generic (PLEG): container finished" podID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerID="a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a" exitCode=0 Mar 21 04:52:19 crc kubenswrapper[4775]: I0321 04:52:19.325540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerDied","Data":"a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a"} Mar 21 
04:52:19 crc kubenswrapper[4775]: I0321 04:52:19.328258 4775 generic.go:334] "Generic (PLEG): container finished" podID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerID="21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d" exitCode=0 Mar 21 04:52:19 crc kubenswrapper[4775]: I0321 04:52:19.328343 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerDied","Data":"21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d"} Mar 21 04:52:24 crc kubenswrapper[4775]: I0321 04:52:24.645155 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:52:24 crc kubenswrapper[4775]: I0321 04:52:24.646261 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:52:24 crc kubenswrapper[4775]: I0321 04:52:24.849079 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:24 crc kubenswrapper[4775]: I0321 04:52:24.849148 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:25 crc kubenswrapper[4775]: I0321 04:52:25.373618 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:25 crc kubenswrapper[4775]: I0321 04:52:25.375772 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:52:25 crc kubenswrapper[4775]: I0321 04:52:25.423254 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:52:25 crc kubenswrapper[4775]: I0321 04:52:25.437751 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:25 crc kubenswrapper[4775]: I0321 04:52:25.890403 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"] Mar 21 04:52:27 crc kubenswrapper[4775]: I0321 04:52:27.373298 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v5fgf" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="registry-server" containerID="cri-o://e9e73c129b659ac9d1d7ebb1db8f0b40547803719861caf1d1794c7a799f2a15" gracePeriod=2 Mar 21 04:52:27 crc kubenswrapper[4775]: I0321 04:52:27.720949 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:27 crc kubenswrapper[4775]: I0321 04:52:27.721045 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:27 crc kubenswrapper[4775]: I0321 04:52:27.779580 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:28 crc kubenswrapper[4775]: I0321 04:52:28.418789 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:29 crc kubenswrapper[4775]: I0321 04:52:29.385623 4775 generic.go:334] "Generic (PLEG): container finished" podID="de54a48a-b733-4042-80b6-ecc719712314" containerID="e9e73c129b659ac9d1d7ebb1db8f0b40547803719861caf1d1794c7a799f2a15" exitCode=0 Mar 21 04:52:29 crc kubenswrapper[4775]: I0321 04:52:29.385695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerDied","Data":"e9e73c129b659ac9d1d7ebb1db8f0b40547803719861caf1d1794c7a799f2a15"} Mar 21 04:52:30 crc 
kubenswrapper[4775]: I0321 04:52:30.095632 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"] Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.392141 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cfvcz" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="registry-server" containerID="cri-o://5448b11851da85cc3503470566b1ac0017e53d4c7a81182921503467a33e3be5" gracePeriod=2 Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.739582 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.815560 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content\") pod \"de54a48a-b733-4042-80b6-ecc719712314\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.815639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98dgw\" (UniqueName: \"kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw\") pod \"de54a48a-b733-4042-80b6-ecc719712314\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.815769 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities\") pod \"de54a48a-b733-4042-80b6-ecc719712314\" (UID: \"de54a48a-b733-4042-80b6-ecc719712314\") " Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.816726 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities" 
(OuterVolumeSpecName: "utilities") pod "de54a48a-b733-4042-80b6-ecc719712314" (UID: "de54a48a-b733-4042-80b6-ecc719712314"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.823079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw" (OuterVolumeSpecName: "kube-api-access-98dgw") pod "de54a48a-b733-4042-80b6-ecc719712314" (UID: "de54a48a-b733-4042-80b6-ecc719712314"). InnerVolumeSpecName "kube-api-access-98dgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.916534 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98dgw\" (UniqueName: \"kubernetes.io/projected/de54a48a-b733-4042-80b6-ecc719712314-kube-api-access-98dgw\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:30 crc kubenswrapper[4775]: I0321 04:52:30.916569 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.371271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de54a48a-b733-4042-80b6-ecc719712314" (UID: "de54a48a-b733-4042-80b6-ecc719712314"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.401163 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v5fgf" event={"ID":"de54a48a-b733-4042-80b6-ecc719712314","Type":"ContainerDied","Data":"ade6fa22510b087e9954c736cf8d4e9716ee7ec2fc194bbd254a73a9267ce630"} Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.401264 4775 scope.go:117] "RemoveContainer" containerID="e9e73c129b659ac9d1d7ebb1db8f0b40547803719861caf1d1794c7a799f2a15" Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.401429 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v5fgf" Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.404009 4775 generic.go:334] "Generic (PLEG): container finished" podID="96069bfd-088a-4053-ab37-76a04683a6a6" containerID="5448b11851da85cc3503470566b1ac0017e53d4c7a81182921503467a33e3be5" exitCode=0 Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.404040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerDied","Data":"5448b11851da85cc3503470566b1ac0017e53d4c7a81182921503467a33e3be5"} Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.420666 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de54a48a-b733-4042-80b6-ecc719712314-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.436639 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"] Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.439242 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v5fgf"] Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 
04:52:31.626008 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" containerID="cri-o://e7db5d2af2536f63217dcf1801711995965b013eb5a3eabb8f70b3e311b18d76" gracePeriod=15 Mar 21 04:52:31 crc kubenswrapper[4775]: I0321 04:52:31.671014 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de54a48a-b733-4042-80b6-ecc719712314" path="/var/lib/kubelet/pods/de54a48a-b733-4042-80b6-ecc719712314/volumes" Mar 21 04:52:33 crc kubenswrapper[4775]: I0321 04:52:33.211276 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jd86b container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Mar 21 04:52:33 crc kubenswrapper[4775]: I0321 04:52:33.211351 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.388994 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"] Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.389531 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" containerName="controller-manager" containerID="cri-o://5494d63ffe4e008a81f5adf939d712754549a2efc64457da115e33ba5aea120c" gracePeriod=30 Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.423060 4775 generic.go:334] "Generic 
(PLEG): container finished" podID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerID="e7db5d2af2536f63217dcf1801711995965b013eb5a3eabb8f70b3e311b18d76" exitCode=0 Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.423107 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" event={"ID":"fddde3da-8512-4e62-9c38-b59f98e117e0","Type":"ContainerDied","Data":"e7db5d2af2536f63217dcf1801711995965b013eb5a3eabb8f70b3e311b18d76"} Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.486925 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"] Mar 21 04:52:34 crc kubenswrapper[4775]: I0321 04:52:34.487169 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" podUID="ef0e747e-48cd-4a7d-922d-905e15ea750e" containerName="route-controller-manager" containerID="cri-o://2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b" gracePeriod=30 Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.070928 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.077617 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdggw\" (UniqueName: \"kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw\") pod \"96069bfd-088a-4053-ab37-76a04683a6a6\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.077787 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities\") pod \"96069bfd-088a-4053-ab37-76a04683a6a6\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.077902 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content\") pod \"96069bfd-088a-4053-ab37-76a04683a6a6\" (UID: \"96069bfd-088a-4053-ab37-76a04683a6a6\") " Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.078767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities" (OuterVolumeSpecName: "utilities") pod "96069bfd-088a-4053-ab37-76a04683a6a6" (UID: "96069bfd-088a-4053-ab37-76a04683a6a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.084220 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw" (OuterVolumeSpecName: "kube-api-access-kdggw") pod "96069bfd-088a-4053-ab37-76a04683a6a6" (UID: "96069bfd-088a-4053-ab37-76a04683a6a6"). InnerVolumeSpecName "kube-api-access-kdggw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.179043 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.179075 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdggw\" (UniqueName: \"kubernetes.io/projected/96069bfd-088a-4053-ab37-76a04683a6a6-kube-api-access-kdggw\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.219948 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96069bfd-088a-4053-ab37-76a04683a6a6" (UID: "96069bfd-088a-4053-ab37-76a04683a6a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.280553 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96069bfd-088a-4053-ab37-76a04683a6a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.433104 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfvcz" event={"ID":"96069bfd-088a-4053-ab37-76a04683a6a6","Type":"ContainerDied","Data":"778fa4db06d864ab1638a1a9d9660eefb0a2a7f44eb1587a31bef24f86e53ae8"} Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.433237 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cfvcz" Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.464912 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"] Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.471651 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cfvcz"] Mar 21 04:52:35 crc kubenswrapper[4775]: I0321 04:52:35.667525 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" path="/var/lib/kubelet/pods/96069bfd-088a-4053-ab37-76a04683a6a6/volumes" Mar 21 04:52:36 crc kubenswrapper[4775]: I0321 04:52:36.439926 4775 generic.go:334] "Generic (PLEG): container finished" podID="05791985-4754-4e1e-9400-7204c6a5eab2" containerID="5494d63ffe4e008a81f5adf939d712754549a2efc64457da115e33ba5aea120c" exitCode=0 Mar 21 04:52:36 crc kubenswrapper[4775]: I0321 04:52:36.439982 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" event={"ID":"05791985-4754-4e1e-9400-7204c6a5eab2","Type":"ContainerDied","Data":"5494d63ffe4e008a81f5adf939d712754549a2efc64457da115e33ba5aea120c"} Mar 21 04:52:36 crc kubenswrapper[4775]: I0321 04:52:36.575827 4775 patch_prober.go:28] interesting pod/controller-manager-7bd4667b97-z2cn5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Mar 21 04:52:36 crc kubenswrapper[4775]: I0321 04:52:36.575887 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: 
connect: connection refused" Mar 21 04:52:36 crc kubenswrapper[4775]: I0321 04:52:36.896820 4775 scope.go:117] "RemoveContainer" containerID="661b6896c2a4aa43857ae9de8bb6472c19fadc3bd2c01a766c40cd5ed12ea550" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.215259 4775 scope.go:117] "RemoveContainer" containerID="c42abfff124b04fcc4cb97eb8de832fa3a68e69db6ccb668907d78050544eb6c" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.297509 4775 scope.go:117] "RemoveContainer" containerID="5448b11851da85cc3503470566b1ac0017e53d4c7a81182921503467a33e3be5" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.315787 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.322822 4775 scope.go:117] "RemoveContainer" containerID="86d840f2a45ca8b3bb3a23817480b80390f3ea92771806965d547cb2a92d185e" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.330484 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.376261 4775 scope.go:117] "RemoveContainer" containerID="e2fe2d90884acf37e8aab163e4c967e9d771b2b2c054a404750ee68dc72d022c" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.383879 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.447007 4775 generic.go:334] "Generic (PLEG): container finished" podID="ef0e747e-48cd-4a7d-922d-905e15ea750e" containerID="2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b" exitCode=0 Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.447210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" event={"ID":"ef0e747e-48cd-4a7d-922d-905e15ea750e","Type":"ContainerDied","Data":"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b"} Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.447999 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" event={"ID":"ef0e747e-48cd-4a7d-922d-905e15ea750e","Type":"ContainerDied","Data":"f1c5e9ed26c65634fec4b7e068160478cde4bffe6efaa83fc8c27834b2f650f2"} Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.447305 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.448030 4775 scope.go:117] "RemoveContainer" containerID="2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.450861 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.451416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd86b" event={"ID":"fddde3da-8512-4e62-9c38-b59f98e117e0","Type":"ContainerDied","Data":"bdc921caf55fdc9483dd16d86e33b3865a46d3e8a029b5b72e32afe0666cef10"} Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.455493 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" event={"ID":"05791985-4754-4e1e-9400-7204c6a5eab2","Type":"ContainerDied","Data":"1b352205937c3361b5c47711682f5ba564f7c8d78431497ed52ef09e4be45784"} Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.455560 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bd4667b97-z2cn5" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.462029 4775 scope.go:117] "RemoveContainer" containerID="2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.462411 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b\": container with ID starting with 2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b not found: ID does not exist" containerID="2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.462447 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b"} err="failed to get container status \"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b\": rpc error: code = NotFound desc = could not 
find container \"2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b\": container with ID starting with 2dfd1a63e89a57a5a6969d7c49601a2ff07f69ec16bc071f9034c83d8f03b10b not found: ID does not exist" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.462471 4775 scope.go:117] "RemoveContainer" containerID="e7db5d2af2536f63217dcf1801711995965b013eb5a3eabb8f70b3e311b18d76" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.495977 4775 scope.go:117] "RemoveContainer" containerID="5494d63ffe4e008a81f5adf939d712754549a2efc64457da115e33ba5aea120c" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.508812 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.508870 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.508899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.508961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.509974 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hc9l\" (UniqueName: \"kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l\") pod \"ef0e747e-48cd-4a7d-922d-905e15ea750e\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510007 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config\") pod \"ef0e747e-48cd-4a7d-922d-905e15ea750e\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca\") pod \"05791985-4754-4e1e-9400-7204c6a5eab2\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510075 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: 
\"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fw6r\" (UniqueName: \"kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510147 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles\") pod \"05791985-4754-4e1e-9400-7204c6a5eab2\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510172 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510193 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca\") pod \"ef0e747e-48cd-4a7d-922d-905e15ea750e\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510247 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert\") pod \"ef0e747e-48cd-4a7d-922d-905e15ea750e\" (UID: \"ef0e747e-48cd-4a7d-922d-905e15ea750e\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510277 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config\") pod \"05791985-4754-4e1e-9400-7204c6a5eab2\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510300 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert\") pod \"05791985-4754-4e1e-9400-7204c6a5eab2\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510450 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies\") pod 
\"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510473 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6s67\" (UniqueName: \"kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67\") pod \"05791985-4754-4e1e-9400-7204c6a5eab2\" (UID: \"05791985-4754-4e1e-9400-7204c6a5eab2\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510527 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir\") pod \"fddde3da-8512-4e62-9c38-b59f98e117e0\" (UID: \"fddde3da-8512-4e62-9c38-b59f98e117e0\") " Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.510765 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.511870 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.512335 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca" (OuterVolumeSpecName: "client-ca") pod "05791985-4754-4e1e-9400-7204c6a5eab2" (UID: "05791985-4754-4e1e-9400-7204c6a5eab2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.512500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config" (OuterVolumeSpecName: "config") pod "ef0e747e-48cd-4a7d-922d-905e15ea750e" (UID: "ef0e747e-48cd-4a7d-922d-905e15ea750e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.512522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config" (OuterVolumeSpecName: "config") pod "05791985-4754-4e1e-9400-7204c6a5eab2" (UID: "05791985-4754-4e1e-9400-7204c6a5eab2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.513207 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "05791985-4754-4e1e-9400-7204c6a5eab2" (UID: "05791985-4754-4e1e-9400-7204c6a5eab2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.513624 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "05791985-4754-4e1e-9400-7204c6a5eab2" (UID: "05791985-4754-4e1e-9400-7204c6a5eab2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.514040 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.515359 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l" (OuterVolumeSpecName: "kube-api-access-9hc9l") pod "ef0e747e-48cd-4a7d-922d-905e15ea750e" (UID: "ef0e747e-48cd-4a7d-922d-905e15ea750e"). InnerVolumeSpecName "kube-api-access-9hc9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.515704 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.515927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.516085 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ef0e747e-48cd-4a7d-922d-905e15ea750e" (UID: "ef0e747e-48cd-4a7d-922d-905e15ea750e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.516328 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67" (OuterVolumeSpecName: "kube-api-access-l6s67") pod "05791985-4754-4e1e-9400-7204c6a5eab2" (UID: "05791985-4754-4e1e-9400-7204c6a5eab2"). InnerVolumeSpecName "kube-api-access-l6s67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.516483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.516670 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.517019 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.517277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca" (OuterVolumeSpecName: "client-ca") pod "ef0e747e-48cd-4a7d-922d-905e15ea750e" (UID: "ef0e747e-48cd-4a7d-922d-905e15ea750e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.518105 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.519825 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.520505 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r" (OuterVolumeSpecName: "kube-api-access-4fw6r") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "kube-api-access-4fw6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.520651 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.520822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.523621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fddde3da-8512-4e62-9c38-b59f98e117e0" (UID: "fddde3da-8512-4e62-9c38-b59f98e117e0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611299 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611336 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6s67\" (UniqueName: \"kubernetes.io/projected/05791985-4754-4e1e-9400-7204c6a5eab2-kube-api-access-l6s67\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611352 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611363 4775 reconciler_common.go:293] "Volume detached for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fddde3da-8512-4e62-9c38-b59f98e117e0-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611374 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611385 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611396 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611408 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611419 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hc9l\" (UniqueName: \"kubernetes.io/projected/ef0e747e-48cd-4a7d-922d-905e15ea750e-kube-api-access-9hc9l\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611430 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611440 4775 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611452 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611462 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611473 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fw6r\" (UniqueName: \"kubernetes.io/projected/fddde3da-8512-4e62-9c38-b59f98e117e0-kube-api-access-4fw6r\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611484 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611494 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611504 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef0e747e-48cd-4a7d-922d-905e15ea750e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611517 4775 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611527 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef0e747e-48cd-4a7d-922d-905e15ea750e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611537 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05791985-4754-4e1e-9400-7204c6a5eab2-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611546 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611561 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05791985-4754-4e1e-9400-7204c6a5eab2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.611573 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fddde3da-8512-4e62-9c38-b59f98e117e0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.800673 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"] Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.823437 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbcc6866-hpvz8"] Mar 21 04:52:37 crc 
kubenswrapper[4775]: I0321 04:52:37.829876 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.839655 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd86b"] Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.844109 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"] Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.847753 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bd4667b97-z2cn5"] Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977444 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.977883 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="extract-utilities" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977903 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="extract-utilities" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.977916 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="extract-utilities" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977923 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="extract-utilities" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.977934 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977943 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.977954 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="extract-content" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977962 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="extract-content" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.977979 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6977ba5-d878-4457-a6d4-8acd42ebb089" containerName="pruner" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.977987 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6977ba5-d878-4457-a6d4-8acd42ebb089" containerName="pruner" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978005 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8101654-10fd-404b-ae0f-a098719418f4" containerName="oc" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978012 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8101654-10fd-404b-ae0f-a098719418f4" containerName="oc" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978026 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978035 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978049 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="extract-content" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978056 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="extract-content" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978065 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978072 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978084 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" containerName="controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978091 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" containerName="controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: E0321 04:52:37.978099 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef0e747e-48cd-4a7d-922d-905e15ea750e" containerName="route-controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978108 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0e747e-48cd-4a7d-922d-905e15ea750e" containerName="route-controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978239 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" containerName="controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978252 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6977ba5-d878-4457-a6d4-8acd42ebb089" containerName="pruner" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978263 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="96069bfd-088a-4053-ab37-76a04683a6a6" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978275 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ef0e747e-48cd-4a7d-922d-905e15ea750e" containerName="route-controller-manager" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978287 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" containerName="oauth-openshift" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978309 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8101654-10fd-404b-ae0f-a098719418f4" containerName="oc" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978321 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="de54a48a-b733-4042-80b6-ecc719712314" containerName="registry-server" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.978742 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.981039 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.981390 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.981581 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.982070 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.982279 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.983034 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.994649 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:52:37 crc kubenswrapper[4775]: I0321 04:52:37.996673 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.022867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.022914 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.022942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.022984 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbhf\" (UniqueName: 
\"kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.023017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.124107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.124185 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.124209 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.124243 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.124284 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbhf\" (UniqueName: \"kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.125529 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.125667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.126229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 
04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.128284 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.140818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbhf\" (UniqueName: \"kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf\") pod \"controller-manager-6f88f547f-kjxrc\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.293345 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.476241 4775 generic.go:334] "Generic (PLEG): container finished" podID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerID="a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1" exitCode=0 Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.476627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerDied","Data":"a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1"} Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.483583 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerStarted","Data":"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b"} Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.488057 
4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerStarted","Data":"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a"} Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.493905 4775 generic.go:334] "Generic (PLEG): container finished" podID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerID="1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0" exitCode=0 Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.493998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerDied","Data":"1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0"} Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.497897 4775 generic.go:334] "Generic (PLEG): container finished" podID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerID="93e802bba3a0a60860befd9b343fb138a5913ba96b3cd2219f5f37eec14db557" exitCode=0 Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.497943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerDied","Data":"93e802bba3a0a60860befd9b343fb138a5913ba96b3cd2219f5f37eec14db557"} Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.554980 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbtgv" podStartSLOduration=6.418065519 podStartE2EDuration="1m24.55496296s" podCreationTimestamp="2026-03-21 04:51:14 +0000 UTC" firstStartedPulling="2026-03-21 04:51:16.892010759 +0000 UTC m=+229.868474373" lastFinishedPulling="2026-03-21 04:52:35.02890819 +0000 UTC m=+308.005371814" observedRunningTime="2026-03-21 04:52:38.554581279 +0000 UTC m=+311.531044913" watchObservedRunningTime="2026-03-21 
04:52:38.55496296 +0000 UTC m=+311.531426584" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.574915 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hldm7" podStartSLOduration=6.632767548 podStartE2EDuration="1m22.574894485s" podCreationTimestamp="2026-03-21 04:51:16 +0000 UTC" firstStartedPulling="2026-03-21 04:51:20.874312246 +0000 UTC m=+233.850775870" lastFinishedPulling="2026-03-21 04:52:36.816439173 +0000 UTC m=+309.792902807" observedRunningTime="2026-03-21 04:52:38.572748593 +0000 UTC m=+311.549212227" watchObservedRunningTime="2026-03-21 04:52:38.574894485 +0000 UTC m=+311.551358109" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.696885 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:52:38 crc kubenswrapper[4775]: W0321 04:52:38.701482 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7da7ce8b_402a_40f7_a9ce_340524ad8573.slice/crio-2e2beab81a4014c294e6052e86efc7d9c966045a25b61fae9abde1855c0e91eb WatchSource:0}: Error finding container 2e2beab81a4014c294e6052e86efc7d9c966045a25b61fae9abde1855c0e91eb: Status 404 returned error can't find the container with id 2e2beab81a4014c294e6052e86efc7d9c966045a25b61fae9abde1855c0e91eb Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.975751 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"] Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.976891 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.980182 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.980283 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.980720 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.980935 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.981187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.981244 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:52:38 crc kubenswrapper[4775]: I0321 04:52:38.992241 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"] Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.137724 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.137771 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.137877 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65v5z\" (UniqueName: \"kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.138224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.239628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.240191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config\") pod 
\"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.240226 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.240378 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65v5z\" (UniqueName: \"kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.241249 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.241616 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.245634 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.261182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65v5z\" (UniqueName: \"kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z\") pod \"route-controller-manager-6d6c87d469-rgj5z\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") " pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.325136 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.515249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerStarted","Data":"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a"} Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.517892 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" event={"ID":"7da7ce8b-402a-40f7-a9ce-340524ad8573","Type":"ContainerStarted","Data":"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f"} Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.517926 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" 
event={"ID":"7da7ce8b-402a-40f7-a9ce-340524ad8573","Type":"ContainerStarted","Data":"2e2beab81a4014c294e6052e86efc7d9c966045a25b61fae9abde1855c0e91eb"} Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.518144 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.521772 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerStarted","Data":"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b"} Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.525977 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.532554 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerStarted","Data":"e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402"} Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.535696 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-svf9w" podStartSLOduration=3.283391472 podStartE2EDuration="1m25.535684827s" podCreationTimestamp="2026-03-21 04:51:14 +0000 UTC" firstStartedPulling="2026-03-21 04:51:16.879404137 +0000 UTC m=+229.855867761" lastFinishedPulling="2026-03-21 04:52:39.131697492 +0000 UTC m=+312.108161116" observedRunningTime="2026-03-21 04:52:39.533670869 +0000 UTC m=+312.510134493" watchObservedRunningTime="2026-03-21 04:52:39.535684827 +0000 UTC m=+312.512148451" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.572009 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" podStartSLOduration=5.571987155 podStartE2EDuration="5.571987155s" podCreationTimestamp="2026-03-21 04:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:39.55726094 +0000 UTC m=+312.533724564" watchObservedRunningTime="2026-03-21 04:52:39.571987155 +0000 UTC m=+312.548450779" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.606877 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7k9df" podStartSLOduration=9.692104696 podStartE2EDuration="1m23.606859242s" podCreationTimestamp="2026-03-21 04:51:16 +0000 UTC" firstStartedPulling="2026-03-21 04:51:24.991229569 +0000 UTC m=+237.967693193" lastFinishedPulling="2026-03-21 04:52:38.905984115 +0000 UTC m=+311.882447739" observedRunningTime="2026-03-21 04:52:39.605166223 +0000 UTC m=+312.581629847" watchObservedRunningTime="2026-03-21 04:52:39.606859242 +0000 UTC m=+312.583322866" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.607772 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wjfwd" podStartSLOduration=8.712020172 podStartE2EDuration="1m22.607763008s" podCreationTimestamp="2026-03-21 04:51:17 +0000 UTC" firstStartedPulling="2026-03-21 04:51:25.042959754 +0000 UTC m=+238.019423378" lastFinishedPulling="2026-03-21 04:52:38.93870259 +0000 UTC m=+311.915166214" observedRunningTime="2026-03-21 04:52:39.579462211 +0000 UTC m=+312.555925835" watchObservedRunningTime="2026-03-21 04:52:39.607763008 +0000 UTC m=+312.584226632" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.670761 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05791985-4754-4e1e-9400-7204c6a5eab2" path="/var/lib/kubelet/pods/05791985-4754-4e1e-9400-7204c6a5eab2/volumes" Mar 21 04:52:39 crc 
kubenswrapper[4775]: I0321 04:52:39.671609 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef0e747e-48cd-4a7d-922d-905e15ea750e" path="/var/lib/kubelet/pods/ef0e747e-48cd-4a7d-922d-905e15ea750e/volumes" Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.672240 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fddde3da-8512-4e62-9c38-b59f98e117e0" path="/var/lib/kubelet/pods/fddde3da-8512-4e62-9c38-b59f98e117e0/volumes" Mar 21 04:52:39 crc kubenswrapper[4775]: W0321 04:52:39.736665 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1882b1c5_2465_40df_9306_e5792e0d9f2f.slice/crio-1a6793f0695cc3d26a11dee78d3b1e2f638d464d822526dcc3f7aa8f363599f1 WatchSource:0}: Error finding container 1a6793f0695cc3d26a11dee78d3b1e2f638d464d822526dcc3f7aa8f363599f1: Status 404 returned error can't find the container with id 1a6793f0695cc3d26a11dee78d3b1e2f638d464d822526dcc3f7aa8f363599f1 Mar 21 04:52:39 crc kubenswrapper[4775]: I0321 04:52:39.737323 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"] Mar 21 04:52:40 crc kubenswrapper[4775]: I0321 04:52:40.550336 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" event={"ID":"1882b1c5-2465-40df-9306-e5792e0d9f2f","Type":"ContainerStarted","Data":"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b"} Mar 21 04:52:40 crc kubenswrapper[4775]: I0321 04:52:40.550394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" event={"ID":"1882b1c5-2465-40df-9306-e5792e0d9f2f","Type":"ContainerStarted","Data":"1a6793f0695cc3d26a11dee78d3b1e2f638d464d822526dcc3f7aa8f363599f1"} Mar 21 04:52:40 crc kubenswrapper[4775]: I0321 04:52:40.550887 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:40 crc kubenswrapper[4775]: I0321 04:52:40.555527 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:52:40 crc kubenswrapper[4775]: I0321 04:52:40.568816 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" podStartSLOduration=6.568799767 podStartE2EDuration="6.568799767s" podCreationTimestamp="2026-03-21 04:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:52:40.565519792 +0000 UTC m=+313.541983416" watchObservedRunningTime="2026-03-21 04:52:40.568799767 +0000 UTC m=+313.545263391" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.901471 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.902212 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.902474 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.903020 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70" gracePeriod=15 Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.903063 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401" gracePeriod=15 Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.903100 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc" gracePeriod=15 Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.903100 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb" gracePeriod=15 Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.903252 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948" gracePeriod=15 Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904578 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904715 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904731 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904738 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904744 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904756 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904766 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904772 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904782 
4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904788 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904795 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904800 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904808 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904814 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.904821 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904827 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904938 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904947 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904954 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904963 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.904989 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.905000 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.905009 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:52:41 crc kubenswrapper[4775]: E0321 04:52:41.906110 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.906143 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.906259 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.906268 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc 
kubenswrapper[4775]: E0321 04:52:41.906362 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.906370 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.960655 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981640 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981726 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981774 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981792 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981807 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981836 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:41 crc kubenswrapper[4775]: I0321 04:52:41.981851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085766 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085825 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085859 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085912 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.086017 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.086026 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.086061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.086077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.085916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.086164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.258924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:52:42 crc kubenswrapper[4775]: W0321 04:52:42.285713 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4e3fc3de5b7765ff18c8282fe983ce2daa4b942f0497b52cf4a498cd4d684581 WatchSource:0}: Error finding container 4e3fc3de5b7765ff18c8282fe983ce2daa4b942f0497b52cf4a498cd4d684581: Status 404 returned error can't find the container with id 4e3fc3de5b7765ff18c8282fe983ce2daa4b942f0497b52cf4a498cd4d684581 Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.289301 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec223cf4ca9d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:52:42.288105939 +0000 UTC m=+315.264569563,LastTimestamp:2026-03-21 04:52:42.288105939 +0000 UTC m=+315.264569563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.568038 4775 generic.go:334] "Generic (PLEG): container finished" podID="bc0e122c-1775-4bf1-9025-6288c383b3f2" containerID="de0bb723a8e5a2f1ecac5f9d70f09000ecc4cbf0b2ce82838fe0f1932e47f097" exitCode=0 Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.568157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc0e122c-1775-4bf1-9025-6288c383b3f2","Type":"ContainerDied","Data":"de0bb723a8e5a2f1ecac5f9d70f09000ecc4cbf0b2ce82838fe0f1932e47f097"} Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.572349 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.572677 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.572929 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.577214 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.578662 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.579255 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401" exitCode=0 Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.579277 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948" exitCode=0 Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.579286 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb" exitCode=0 Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.579294 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc" exitCode=2 Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.579370 4775 scope.go:117] "RemoveContainer" containerID="85e412eedce18f7222c49be795b325ffdb05306df25f8ff195a08162142324b4" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.582299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4e3fc3de5b7765ff18c8282fe983ce2daa4b942f0497b52cf4a498cd4d684581"} Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.583191 4775 status_manager.go:851] "Failed to get 
status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.583630 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.584009 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.736927 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.737392 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.737817 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc 
kubenswrapper[4775]: E0321 04:52:42.738111 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.738433 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:42 crc kubenswrapper[4775]: I0321 04:52:42.738480 4775 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.738780 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="200ms" Mar 21 04:52:42 crc kubenswrapper[4775]: E0321 04:52:42.939894 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="400ms" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.133995 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.134091 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.227181 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:04dd6a6688563ae1dbeb924643a49d81150c900e2f7e81aaab8a35
86e6f53044\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:bc9a28a1c3a5f16a1873e4919f236a9824c62c6394093adb4850b30daf95761c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252939492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0634296304362d86f6b8d0de6c3e1b327670f86a3e0a664a7fd2fa3984fbbb18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:1a955538926dc95f0f30489b33552cf1f18ae70528576ef7128acde9f7d9a0b6\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223645396},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":50294
3148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.227988 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.228470 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.228852 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.229213 4775 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.229241 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.276572 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.276665 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: E0321 04:52:43.340580 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="800ms" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.592438 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.595925 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b"} Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.918065 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.919185 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:43 crc kubenswrapper[4775]: I0321 04:52:43.919696 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.009603 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir\") pod \"bc0e122c-1775-4bf1-9025-6288c383b3f2\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.010127 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock\") pod \"bc0e122c-1775-4bf1-9025-6288c383b3f2\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.010176 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access\") pod \"bc0e122c-1775-4bf1-9025-6288c383b3f2\" (UID: \"bc0e122c-1775-4bf1-9025-6288c383b3f2\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.010624 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc0e122c-1775-4bf1-9025-6288c383b3f2" (UID: "bc0e122c-1775-4bf1-9025-6288c383b3f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.010710 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock" (OuterVolumeSpecName: "var-lock") pod "bc0e122c-1775-4bf1-9025-6288c383b3f2" (UID: "bc0e122c-1775-4bf1-9025-6288c383b3f2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.025568 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc0e122c-1775-4bf1-9025-6288c383b3f2" (UID: "bc0e122c-1775-4bf1-9025-6288c383b3f2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.111759 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.111829 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc0e122c-1775-4bf1-9025-6288c383b3f2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.111843 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc0e122c-1775-4bf1-9025-6288c383b3f2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.141934 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="1.6s" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.279347 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.280140 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.281884 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.282155 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.282357 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.313795 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.313861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.313893 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.314161 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.314190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.314206 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.415336 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.415379 4775 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.415392 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.541167 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.541281 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.582332 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.582990 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.583643 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.583933 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.584264 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.604801 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.605814 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70" exitCode=0 Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.605897 4775 scope.go:117] "RemoveContainer" containerID="aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.606065 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.608929 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.608839 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc0e122c-1775-4bf1-9025-6288c383b3f2","Type":"ContainerDied","Data":"df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09"} Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.609355 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0920ca1c2002e3d0e08d9fcffa648209bafd25e83252350dced0f61be9bf09" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.620449 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.620663 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.620832 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.620980 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.625897 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.626653 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.627076 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.627449 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.630341 4775 scope.go:117] "RemoveContainer" containerID="1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948" Mar 21 
04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.643709 4775 scope.go:117] "RemoveContainer" containerID="fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.659917 4775 scope.go:117] "RemoveContainer" containerID="fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.662718 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.663577 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.663965 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.664473 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.664768 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.686417 4775 scope.go:117] "RemoveContainer" containerID="3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.704468 4775 scope.go:117] "RemoveContainer" containerID="381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.733268 4775 scope.go:117] "RemoveContainer" containerID="aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.733846 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\": container with ID starting with aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401 not found: ID does not exist" containerID="aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.733888 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401"} err="failed to get container status \"aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\": rpc error: code = NotFound desc = could not find container \"aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401\": container with ID starting with aa32e604ef1a820af34bd67cc5002877686b95aab86676f842ac1761e82cc401 not found: ID does not exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.733915 4775 scope.go:117] "RemoveContainer" containerID="1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.734414 
4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\": container with ID starting with 1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948 not found: ID does not exist" containerID="1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.734443 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948"} err="failed to get container status \"1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\": rpc error: code = NotFound desc = could not find container \"1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948\": container with ID starting with 1e58586e2fb6aa3271522f23c55846e0e6001fdd90935c516a9a6e7a3e959948 not found: ID does not exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.734463 4775 scope.go:117] "RemoveContainer" containerID="fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.734772 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\": container with ID starting with fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb not found: ID does not exist" containerID="fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.734819 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb"} err="failed to get container status \"fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\": rpc error: code = 
NotFound desc = could not find container \"fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb\": container with ID starting with fa001dc47bf149b06b215b924a37cac3440e089b361911df93a4c0143a3f53cb not found: ID does not exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.734849 4775 scope.go:117] "RemoveContainer" containerID="fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.735106 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\": container with ID starting with fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc not found: ID does not exist" containerID="fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.735139 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc"} err="failed to get container status \"fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\": rpc error: code = NotFound desc = could not find container \"fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc\": container with ID starting with fa4697d53ba36739e797dace487196678b2dc17d70e745df89372d46af5657bc not found: ID does not exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.735154 4775 scope.go:117] "RemoveContainer" containerID="3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.736405 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\": container with ID starting with 
3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70 not found: ID does not exist" containerID="3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.736425 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70"} err="failed to get container status \"3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\": rpc error: code = NotFound desc = could not find container \"3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70\": container with ID starting with 3e874e78570f0b3a6bd4016ccfa09d4c6b81827dc006f3b17f9c93dc65bcac70 not found: ID does not exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.736437 4775 scope.go:117] "RemoveContainer" containerID="381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec" Mar 21 04:52:44 crc kubenswrapper[4775]: E0321 04:52:44.736638 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\": container with ID starting with 381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec not found: ID does not exist" containerID="381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.736671 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec"} err="failed to get container status \"381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\": rpc error: code = NotFound desc = could not find container \"381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec\": container with ID starting with 381f5c10200a4e21bc69a5b9f8eca747d803e177b8497ebe7d82f12c98aea3ec not found: ID does not 
exist" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.961506 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:52:44 crc kubenswrapper[4775]: I0321 04:52:44.961569 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.000088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.000500 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.000767 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.001061 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.001444 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.001686 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.660183 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.661152 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.661691 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.661952 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc 
kubenswrapper[4775]: I0321 04:52:45.662416 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.662867 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:45 crc kubenswrapper[4775]: I0321 04:52:45.667726 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 21 04:52:45 crc kubenswrapper[4775]: E0321 04:52:45.744252 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="3.2s" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.722588 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.723163 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.781914 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.782735 
4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.783546 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.783864 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.784192 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:46 crc kubenswrapper[4775]: I0321 04:52:46.784527 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.165331 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.165377 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.204024 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.204769 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.205030 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.205411 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.205870 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.206090 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.206339 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: E0321 04:52:47.497481 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec223cf4ca9d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:52:42.288105939 +0000 UTC m=+315.264569563,LastTimestamp:2026-03-21 
04:52:42.288105939 +0000 UTC m=+315.264569563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.540880 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.540953 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.592182 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.592909 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.593446 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.593798 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc 
kubenswrapper[4775]: I0321 04:52:47.594450 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.594970 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.595756 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.596244 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.665778 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 
04:52:47.666391 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.666755 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.668235 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.668672 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.669099 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 
04:52:47.669902 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.678540 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.679214 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.679333 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.679942 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.680470 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.681229 4775 status_manager.go:851] 
"Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.681657 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.682078 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.682526 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.683321 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.683906 4775 status_manager.go:851] 
"Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.684377 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.684823 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.685342 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.685796 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.686084 4775 
status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.688984 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.689345 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.689667 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.690064 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.690535 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.690815 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.691199 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:47 crc kubenswrapper[4775]: I0321 04:52:47.691620 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:48 crc kubenswrapper[4775]: E0321 04:52:48.945412 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="6.4s" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.239429 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:52:53Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:04dd6a6688563ae1dbeb924643a49d81150c900e2f7e81aaab8a3586e6f53044\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:bc9a28a1c3a5f16a1873e4919f236a9824c62c6394093adb4850b30daf95761c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252939492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0634296304362d86f6b8d0de6c3e1b327670f86a3e0a664a7fd2fa3984fbbb18\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:1a955538926dc95f0f30489b33552cf1f18ae70528576ef7128acde9f7d9a0b6\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223645396},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.241214 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.241659 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.243317 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.243546 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.243564 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 
04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.660762 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.661528 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.661884 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.662210 4775 status_manager.go:851] "Failed to get status for pod" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.662555 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.662918 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.663193 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.663581 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.680112 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.680178 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:53 crc kubenswrapper[4775]: E0321 04:52:53.680550 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:53 crc kubenswrapper[4775]: I0321 04:52:53.680988 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:53 crc kubenswrapper[4775]: W0321 04:52:53.699148 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1f41636626addbc72b3497e2fe8340a674aebde57e937a756d885b1e10949d0b WatchSource:0}: Error finding container 1f41636626addbc72b3497e2fe8340a674aebde57e937a756d885b1e10949d0b: Status 404 returned error can't find the container with id 1f41636626addbc72b3497e2fe8340a674aebde57e937a756d885b1e10949d0b Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.668879 4775 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="378b58294d33a1a18c3716c9a58fc22e327620d44b86d33adc8bb7640257fe85" exitCode=0 Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.668979 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"378b58294d33a1a18c3716c9a58fc22e327620d44b86d33adc8bb7640257fe85"} Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.669243 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1f41636626addbc72b3497e2fe8340a674aebde57e937a756d885b1e10949d0b"} Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.669513 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.669529 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:54 crc kubenswrapper[4775]: E0321 04:52:54.669987 4775 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.671754 4775 status_manager.go:851] "Failed to get status for pod" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" pod="openshift-marketplace/community-operators-wbtgv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-wbtgv\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.672261 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.672576 4775 status_manager.go:851] "Failed to get status for pod" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" pod="openshift-marketplace/community-operators-svf9w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-svf9w\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.672837 4775 status_manager.go:851] "Failed to get status for pod" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" pod="openshift-marketplace/redhat-marketplace-hldm7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-hldm7\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.673051 4775 status_manager.go:851] "Failed to get status for pod" 
podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.673335 4775 status_manager.go:851] "Failed to get status for pod" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" pod="openshift-marketplace/redhat-operators-wjfwd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-wjfwd\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:54 crc kubenswrapper[4775]: I0321 04:52:54.673813 4775 status_manager.go:851] "Failed to get status for pod" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" pod="openshift-marketplace/redhat-marketplace-7k9df" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7k9df\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 21 04:52:55 crc kubenswrapper[4775]: I0321 04:52:55.677992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1190cd41522dfa2bcdaf0b043b1b6e08f1d536aefb6a5b94f237c6bf9a884f03"} Mar 21 04:52:55 crc kubenswrapper[4775]: I0321 04:52:55.678603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c4b28862738ff9483688e677523118653d5826d836f2a12d150db9feaea262ba"} Mar 21 04:52:55 crc kubenswrapper[4775]: I0321 04:52:55.678618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"acf01814aa883f6c53895f25cdd6f332afbc3dce1f7d957c3686d6cfbe640bc7"} 
Mar 21 04:52:55 crc kubenswrapper[4775]: I0321 04:52:55.678648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9bc226a97f0f60ca373e5050ef0d1e1c7a82c039c94c9976d98d9b115d6d2dc9"} Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.689948 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.691199 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.691285 4775 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b" exitCode=1 Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.691385 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b"} Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.692902 4775 scope.go:117] "RemoveContainer" containerID="75cf7f90f0971bce2e51690954080e297523b92a5588c8effa75832879f0dd6b" Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.697705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"34b753c2ca5cb3f8746f3ffe7cc0f8289b2f44cd7679e8ac546b87a7bd71bf17"} Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.698034 4775 kubelet.go:1909] "Trying to 
delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.698068 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:52:56 crc kubenswrapper[4775]: I0321 04:52:56.698338 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:57 crc kubenswrapper[4775]: I0321 04:52:57.705548 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:52:57 crc kubenswrapper[4775]: I0321 04:52:57.706437 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:52:57 crc kubenswrapper[4775]: I0321 04:52:57.706487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8a313e0d3f3cfd85aab72e5d3551cfa434d04205dc6565ceaaff0c072352a77"} Mar 21 04:52:58 crc kubenswrapper[4775]: I0321 04:52:58.681881 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:58 crc kubenswrapper[4775]: I0321 04:52:58.681964 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:52:58 crc kubenswrapper[4775]: I0321 04:52:58.688945 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:53:01 crc kubenswrapper[4775]: I0321 04:53:01.709157 4775 kubelet.go:1914] "Deleted 
mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:53:01 crc kubenswrapper[4775]: I0321 04:53:01.730532 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:53:01 crc kubenswrapper[4775]: I0321 04:53:01.730578 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:53:01 crc kubenswrapper[4775]: I0321 04:53:01.737545 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:53:01 crc kubenswrapper[4775]: I0321 04:53:01.905094 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="88571c4e-5944-4868-b298-4cacd3831adc" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.467052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.467227 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.469340 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.469368 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.479749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.485270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.736491 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.737248 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.739738 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="88571c4e-5944-4868-b298-4cacd3831adc" Mar 21 04:53:02 crc kubenswrapper[4775]: I0321 04:53:02.783736 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:53:03 crc kubenswrapper[4775]: W0321 04:53:03.176623 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-4c2f493ebe9ff005b4a2558285dab577d4b15c2f387c7d7ba5d4f6b52621d799 WatchSource:0}: Error finding container 4c2f493ebe9ff005b4a2558285dab577d4b15c2f387c7d7ba5d4f6b52621d799: Status 404 returned error can't find the container with id 4c2f493ebe9ff005b4a2558285dab577d4b15c2f387c7d7ba5d4f6b52621d799 Mar 21 04:53:03 crc kubenswrapper[4775]: I0321 04:53:03.744060 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"aab1d0277ec455f9d7273f185cd4c536fde8444247b04b3d86da1923ad2d0d0e"} Mar 21 04:53:03 crc kubenswrapper[4775]: I0321 04:53:03.744463 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4c2f493ebe9ff005b4a2558285dab577d4b15c2f387c7d7ba5d4f6b52621d799"} Mar 21 04:53:05 crc kubenswrapper[4775]: I0321 04:53:05.017236 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:53:05 crc kubenswrapper[4775]: I0321 04:53:05.616337 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:53:05 crc kubenswrapper[4775]: I0321 04:53:05.620063 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:53:06 crc kubenswrapper[4775]: I0321 
04:53:06.768647 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:53:11 crc kubenswrapper[4775]: I0321 04:53:11.010301 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:53:11 crc kubenswrapper[4775]: I0321 04:53:11.734496 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:53:12 crc kubenswrapper[4775]: I0321 04:53:12.552773 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:12 crc kubenswrapper[4775]: I0321 04:53:12.791462 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 04:53:12 crc kubenswrapper[4775]: I0321 04:53:12.871583 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:53:12 crc kubenswrapper[4775]: I0321 04:53:12.989080 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:53:13 crc kubenswrapper[4775]: I0321 04:53:13.091925 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:53:13 crc kubenswrapper[4775]: I0321 04:53:13.176492 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:53:13 crc kubenswrapper[4775]: I0321 04:53:13.319205 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:53:13 crc kubenswrapper[4775]: I0321 04:53:13.319783 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:53:13 crc kubenswrapper[4775]: I0321 04:53:13.467627 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.012458 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.335931 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.461972 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.481924 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.618231 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.762179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.773486 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.833888 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 04:53:14 crc kubenswrapper[4775]: I0321 04:53:14.959497 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.292571 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.311453 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.453755 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.628622 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.726398 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.765665 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.879044 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.906721 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:53:15 crc kubenswrapper[4775]: I0321 04:53:15.911395 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.004179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.070858 4775 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.121835 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.130563 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.173724 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.197087 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.218498 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.425797 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.554981 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.768911 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.793529 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.981987 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:53:16 crc kubenswrapper[4775]: I0321 04:53:16.997942 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.112261 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.138728 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.541150 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.599222 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.784345 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.797922 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.877517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:53:17 crc kubenswrapper[4775]: I0321 04:53:17.961544 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.030394 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.035553 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.139015 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.156646 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.272332 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.373986 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.385698 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.452102 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.521654 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.525093 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.671022 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 
04:53:18.736086 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.742490 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.771879 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.775516 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.783817 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.902269 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.903755 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:53:18 crc kubenswrapper[4775]: I0321 04:53:18.904875 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.017615 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.111882 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.351525 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:53:19 crc 
kubenswrapper[4775]: I0321 04:53:19.458622 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.486107 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.509235 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.509902 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.530012 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.590897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.647411 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.649762 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.695213 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.767496 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.778594 4775 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.786929 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.844694 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.868622 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.886687 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.888034 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 04:53:19 crc kubenswrapper[4775]: I0321 04:53:19.977204 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.081426 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.088101 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.090860 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.092212 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.138748 4775 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.176327 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.178952 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.276585 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.277741 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.290320 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.315507 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.644692 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.658918 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.663401 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.828439 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 
04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.841758 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.844411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:53:20 crc kubenswrapper[4775]: I0321 04:53:20.994260 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.021807 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.078845 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.252477 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.278648 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.286965 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.338194 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.380836 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.461840 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.491245 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.517088 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.671815 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.707550 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.741701 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.749140 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.801670 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.845776 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.875545 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.929268 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 
04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.941343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.941448 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:53:21 crc kubenswrapper[4775]: I0321 04:53:21.990446 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.048895 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.201615 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.225782 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.260249 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.303467 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.319863 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.379488 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.417876 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.450330 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.470040 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.527035 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.644640 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.650309 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.692262 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.779507 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.819401 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.924697 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.937997 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.963812 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 21 04:53:22 crc kubenswrapper[4775]: I0321 04:53:22.979770 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.082463 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.083031 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.223453 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.246683 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.375740 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.398759 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.535494 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.536022 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.536008707 podStartE2EDuration="42.536008707s" podCreationTimestamp="2026-03-21 04:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:01.759528412 +0000 UTC m=+334.735992046" watchObservedRunningTime="2026-03-21 04:53:23.536008707 +0000 UTC m=+356.512472331"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539310 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539354 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d8984bd85-5vxq9","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:53:23 crc kubenswrapper[4775]: E0321 04:53:23.539518 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" containerName="installer"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539533 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" containerName="installer"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539640 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0e122c-1775-4bf1-9025-6288c383b3f2" containerName="installer"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539984 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.539982 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.540372 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbc8474b-4360-449d-ab37-ba14ca1ac5ed"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.543027 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.543296 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.543349 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.544522 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.544542 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.544619 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.544902 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.546307 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.546381 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.546436 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.546542 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.546709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.547195 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.552464 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.558098 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.561107 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.561308 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.568941 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.568924427 podStartE2EDuration="22.568924427s" podCreationTimestamp="2026-03-21 04:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:23.567080794 +0000 UTC m=+356.543544428" watchObservedRunningTime="2026-03-21 04:53:23.568924427 +0000 UTC m=+356.545388051"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.588604 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-dir\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-session\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641973 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.641993 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642010 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-policies\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642158 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cqt\" (UniqueName: \"kubernetes.io/projected/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-kube-api-access-k2cqt\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642247 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642266 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.642294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.648498 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.681281 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.734281 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743671 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-dir\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743708 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-session\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743747 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743785 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-policies\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743830 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-dir\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2cqt\" (UniqueName: \"kubernetes.io/projected/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-kube-api-access-k2cqt\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.743980 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.744001 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.744018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.744038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.744065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.745812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.746094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.746271 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.746508 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-audit-policies\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.751907 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.752849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-session\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.752865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.753202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.753979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-login\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.754652 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-user-template-error\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.756852 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.759666 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.763803 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2cqt\" (UniqueName: \"kubernetes.io/projected/de6d7fec-5fbc-4fc4-aae8-ecea20324c19-kube-api-access-k2cqt\") pod \"oauth-openshift-7d8984bd85-5vxq9\" (UID: \"de6d7fec-5fbc-4fc4-aae8-ecea20324c19\") " pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.851679 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.864669 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:23 crc kubenswrapper[4775]: I0321 04:53:23.969221 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.027902 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.028203 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b" gracePeriod=5
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.072166 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.110416 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.130194 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.176336 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.188551 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.198770 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.209608 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.217402 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.313839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.343814 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"]
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.366547 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.392107 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.406466 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.450837 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.468861 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.613374 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.613525 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.623916 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.676457 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.677379 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.758946 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.868724 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.873331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9" event={"ID":"de6d7fec-5fbc-4fc4-aae8-ecea20324c19","Type":"ContainerStarted","Data":"83bd7fd026ec0926eed8900ca51da6279293b19fe53634179a6f0265deef8faf"}
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.873394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9" event={"ID":"de6d7fec-5fbc-4fc4-aae8-ecea20324c19","Type":"ContainerStarted","Data":"d89bd894b9223c35f5fe3e89582520f5860fac99b7d3bfea7b88fd00a356bb4b"}
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.873762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.888981 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.894738 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.895956 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9" podStartSLOduration=78.895941402 podStartE2EDuration="1m18.895941402s" podCreationTimestamp="2026-03-21 04:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:24.893210174 +0000 UTC m=+357.869673818" watchObservedRunningTime="2026-03-21 04:53:24.895941402 +0000 UTC m=+357.872405026"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.916741 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.951796 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.956729 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.960748 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 21 04:53:24 crc kubenswrapper[4775]: I0321 04:53:24.993137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d8984bd85-5vxq9"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.089582 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.127632 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.177262 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.183240 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.199485 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.241369 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.291187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.682513 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.876371 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.884981 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 21 04:53:25 crc kubenswrapper[4775]: I0321 04:53:25.941423 4775 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.046002 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.102756 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.178233 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.480355 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.488980 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.565454 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.596310 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:53:26 crc kubenswrapper[4775]: I0321 04:53:26.629777 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.095244 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.097572 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.151441 4775 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.167839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.381788 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.477238 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.581566 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.597900 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.687983 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.783295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.855096 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:53:27 crc kubenswrapper[4775]: I0321 04:53:27.962987 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.009950 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 
21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.056556 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.353155 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.415174 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.449163 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.629690 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.673206 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 04:53:28 crc kubenswrapper[4775]: I0321 04:53:28.941053 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.010465 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.127489 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.377691 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.621457 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.621599 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.670916 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.682370 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.682407 4775 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bd82783d-b559-4327-af5f-eb6d13c77df1" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.687472 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.687530 4775 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bd82783d-b559-4327-af5f-eb6d13c77df1" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.694482 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.714759 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729524 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729692 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.729921 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.730105 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.730341 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.730387 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.730413 4775 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.730436 4775 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.739676 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.831076 4775 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.905429 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.905523 4775 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b" exitCode=137 Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.905596 4775 scope.go:117] "RemoveContainer" containerID="b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.905595 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.932416 4775 scope.go:117] "RemoveContainer" containerID="b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b" Mar 21 04:53:29 crc kubenswrapper[4775]: E0321 04:53:29.933654 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b\": container with ID starting with b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b not found: ID does not exist" containerID="b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b" Mar 21 04:53:29 crc kubenswrapper[4775]: I0321 04:53:29.933777 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b"} err="failed to get container status \"b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b\": rpc error: code = NotFound desc = could not find container \"b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b\": container with ID starting with b2996b891633e8617fdbfa635d270d54374602f00b9f36c117a64b0b6e789a4b not found: ID does not exist" Mar 21 04:53:30 crc kubenswrapper[4775]: I0321 04:53:30.134638 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:53:30 crc kubenswrapper[4775]: I0321 04:53:30.682047 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:53:31 crc kubenswrapper[4775]: I0321 04:53:31.668832 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.379850 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.381216 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" podUID="7da7ce8b-402a-40f7-a9ce-340524ad8573" containerName="controller-manager" containerID="cri-o://b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f" gracePeriod=30 Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.487029 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"] Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.487318 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" podUID="1882b1c5-2465-40df-9306-e5792e0d9f2f" containerName="route-controller-manager" containerID="cri-o://ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b" gracePeriod=30 Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.769546 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.877904 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.903917 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbhf\" (UniqueName: \"kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf\") pod \"7da7ce8b-402a-40f7-a9ce-340524ad8573\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.904018 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config\") pod \"7da7ce8b-402a-40f7-a9ce-340524ad8573\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.904057 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert\") pod \"7da7ce8b-402a-40f7-a9ce-340524ad8573\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.904086 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca\") pod \"7da7ce8b-402a-40f7-a9ce-340524ad8573\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.904157 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles\") pod \"7da7ce8b-402a-40f7-a9ce-340524ad8573\" (UID: \"7da7ce8b-402a-40f7-a9ce-340524ad8573\") " Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.905181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7da7ce8b-402a-40f7-a9ce-340524ad8573" (UID: "7da7ce8b-402a-40f7-a9ce-340524ad8573"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.905493 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca" (OuterVolumeSpecName: "client-ca") pod "7da7ce8b-402a-40f7-a9ce-340524ad8573" (UID: "7da7ce8b-402a-40f7-a9ce-340524ad8573"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.905817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config" (OuterVolumeSpecName: "config") pod "7da7ce8b-402a-40f7-a9ce-340524ad8573" (UID: "7da7ce8b-402a-40f7-a9ce-340524ad8573"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.909645 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf" (OuterVolumeSpecName: "kube-api-access-flbhf") pod "7da7ce8b-402a-40f7-a9ce-340524ad8573" (UID: "7da7ce8b-402a-40f7-a9ce-340524ad8573"). InnerVolumeSpecName "kube-api-access-flbhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.909923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7da7ce8b-402a-40f7-a9ce-340524ad8573" (UID: "7da7ce8b-402a-40f7-a9ce-340524ad8573"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.931344 4775 generic.go:334] "Generic (PLEG): container finished" podID="1882b1c5-2465-40df-9306-e5792e0d9f2f" containerID="ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b" exitCode=0 Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.931422 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.931433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" event={"ID":"1882b1c5-2465-40df-9306-e5792e0d9f2f","Type":"ContainerDied","Data":"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b"} Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.931534 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z" event={"ID":"1882b1c5-2465-40df-9306-e5792e0d9f2f","Type":"ContainerDied","Data":"1a6793f0695cc3d26a11dee78d3b1e2f638d464d822526dcc3f7aa8f363599f1"} Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.931553 4775 scope.go:117] "RemoveContainer" containerID="ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.932608 4775 generic.go:334] "Generic (PLEG): container finished" podID="7da7ce8b-402a-40f7-a9ce-340524ad8573" containerID="b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f" exitCode=0 Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.932630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" 
event={"ID":"7da7ce8b-402a-40f7-a9ce-340524ad8573","Type":"ContainerDied","Data":"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f"} Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.932647 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.932651 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f88f547f-kjxrc" event={"ID":"7da7ce8b-402a-40f7-a9ce-340524ad8573","Type":"ContainerDied","Data":"2e2beab81a4014c294e6052e86efc7d9c966045a25b61fae9abde1855c0e91eb"} Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.945315 4775 scope.go:117] "RemoveContainer" containerID="ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b" Mar 21 04:53:34 crc kubenswrapper[4775]: E0321 04:53:34.945748 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b\": container with ID starting with ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b not found: ID does not exist" containerID="ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.945792 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b"} err="failed to get container status \"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b\": rpc error: code = NotFound desc = could not find container \"ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b\": container with ID starting with ae81d0a5c5add9cc57df7dd09c08b1e07e8506e8be71c91d0656558f8529bd3b not found: ID does not exist" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 
04:53:34.945817 4775 scope.go:117] "RemoveContainer" containerID="b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.965411 4775 scope.go:117] "RemoveContainer" containerID="b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f" Mar 21 04:53:34 crc kubenswrapper[4775]: E0321 04:53:34.965810 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f\": container with ID starting with b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f not found: ID does not exist" containerID="b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.965846 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f"} err="failed to get container status \"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f\": rpc error: code = NotFound desc = could not find container \"b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f\": container with ID starting with b57b29e11c242340693c81affc2d61431477f8ba75c091165a111504297edd4f not found: ID does not exist" Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.967655 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:53:34 crc kubenswrapper[4775]: I0321 04:53:34.974387 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f88f547f-kjxrc"] Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005005 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65v5z\" (UniqueName: 
\"kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z\") pod \"1882b1c5-2465-40df-9306-e5792e0d9f2f\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") "
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005050 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca\") pod \"1882b1c5-2465-40df-9306-e5792e0d9f2f\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") "
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005090 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config\") pod \"1882b1c5-2465-40df-9306-e5792e0d9f2f\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") "
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert\") pod \"1882b1c5-2465-40df-9306-e5792e0d9f2f\" (UID: \"1882b1c5-2465-40df-9306-e5792e0d9f2f\") "
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005480 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7da7ce8b-402a-40f7-a9ce-340524ad8573-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005498 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005509 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005523 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbhf\" (UniqueName: \"kubernetes.io/projected/7da7ce8b-402a-40f7-a9ce-340524ad8573-kube-api-access-flbhf\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005536 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7da7ce8b-402a-40f7-a9ce-340524ad8573-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.005795 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca" (OuterVolumeSpecName: "client-ca") pod "1882b1c5-2465-40df-9306-e5792e0d9f2f" (UID: "1882b1c5-2465-40df-9306-e5792e0d9f2f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.006177 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config" (OuterVolumeSpecName: "config") pod "1882b1c5-2465-40df-9306-e5792e0d9f2f" (UID: "1882b1c5-2465-40df-9306-e5792e0d9f2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.007837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1882b1c5-2465-40df-9306-e5792e0d9f2f" (UID: "1882b1c5-2465-40df-9306-e5792e0d9f2f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.007961 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z" (OuterVolumeSpecName: "kube-api-access-65v5z") pod "1882b1c5-2465-40df-9306-e5792e0d9f2f" (UID: "1882b1c5-2465-40df-9306-e5792e0d9f2f"). InnerVolumeSpecName "kube-api-access-65v5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.106407 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65v5z\" (UniqueName: \"kubernetes.io/projected/1882b1c5-2465-40df-9306-e5792e0d9f2f-kube-api-access-65v5z\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.106493 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.106516 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1882b1c5-2465-40df-9306-e5792e0d9f2f-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.106536 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1882b1c5-2465-40df-9306-e5792e0d9f2f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.263082 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"]
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.266087 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6c87d469-rgj5z"]
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.668039 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1882b1c5-2465-40df-9306-e5792e0d9f2f" path="/var/lib/kubelet/pods/1882b1c5-2465-40df-9306-e5792e0d9f2f/volumes"
Mar 21 04:53:35 crc kubenswrapper[4775]: I0321 04:53:35.668746 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da7ce8b-402a-40f7-a9ce-340524ad8573" path="/var/lib/kubelet/pods/7da7ce8b-402a-40f7-a9ce-340524ad8573/volumes"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021270 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"]
Mar 21 04:53:36 crc kubenswrapper[4775]: E0321 04:53:36.021534 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da7ce8b-402a-40f7-a9ce-340524ad8573" containerName="controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021549 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da7ce8b-402a-40f7-a9ce-340524ad8573" containerName="controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: E0321 04:53:36.021561 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021567 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 21 04:53:36 crc kubenswrapper[4775]: E0321 04:53:36.021580 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1882b1c5-2465-40df-9306-e5792e0d9f2f" containerName="route-controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021588 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1882b1c5-2465-40df-9306-e5792e0d9f2f" containerName="route-controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021698 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da7ce8b-402a-40f7-a9ce-340524ad8573" containerName="controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021717 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.021724 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1882b1c5-2465-40df-9306-e5792e0d9f2f" containerName="route-controller-manager"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.022049 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.025438 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.025550 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.025644 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.025654 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.026862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.027193 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.029731 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"]
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.030454 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.032839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.033222 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.033400 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.033454 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.033620 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.033704 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.034369 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"]
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.036908 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.044676 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"]
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.117731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.117778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.117803 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv79\" (UniqueName: \"kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.117825 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.117945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.118021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.118055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.118079 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jrx\" (UniqueName: \"kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.118102 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219863 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv79\" (UniqueName: \"kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219885 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219910 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.219984 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.220019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67jrx\" (UniqueName: \"kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.220045 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.221217 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.221283 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.221403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.221505 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.222501 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.224050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.232665 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.236331 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxv79\" (UniqueName: \"kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79\") pod \"route-controller-manager-555c6b4d6b-jh8tq\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.238865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67jrx\" (UniqueName: \"kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx\") pod \"controller-manager-5b5b8879f4-snbl7\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.347660 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.363158 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.768144 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"]
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.796990 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"]
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.952924 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" event={"ID":"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb","Type":"ContainerStarted","Data":"56c232d6ac9969429c18b124a338e1b6499df231f55af06b4690d442d2b765b2"}
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.954907 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" event={"ID":"28d4e02d-a2fa-4465-936b-df5f7ef22086","Type":"ContainerStarted","Data":"87509b0780118b1a8a1bce50fbeee0c0687a23effaec2af6b41e83b608d4b8d9"}
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.955164 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.957092 4775 patch_prober.go:28] interesting pod/controller-manager-5b5b8879f4-snbl7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body=
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.957159 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused"
Mar 21 04:53:36 crc kubenswrapper[4775]: I0321 04:53:36.975234 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" podStartSLOduration=2.9752173859999997 podStartE2EDuration="2.975217386s" podCreationTimestamp="2026-03-21 04:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:36.970920411 +0000 UTC m=+369.947384035" watchObservedRunningTime="2026-03-21 04:53:36.975217386 +0000 UTC m=+369.951681010"
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.961626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" event={"ID":"28d4e02d-a2fa-4465-936b-df5f7ef22086","Type":"ContainerStarted","Data":"20c1b7542354b374133f523fc8f60ffcbd5359eb5d860eb5669e6229862d91e5"}
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.963671 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" event={"ID":"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb","Type":"ContainerStarted","Data":"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8"}
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.964022 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.966371 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.968178 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:37 crc kubenswrapper[4775]: I0321 04:53:37.979831 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" podStartSLOduration=3.979815002 podStartE2EDuration="3.979815002s" podCreationTimestamp="2026-03-21 04:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:37.97835616 +0000 UTC m=+370.954819784" watchObservedRunningTime="2026-03-21 04:53:37.979815002 +0000 UTC m=+370.956278626"
Mar 21 04:53:43 crc kubenswrapper[4775]: I0321 04:53:43.997663 4775 generic.go:334] "Generic (PLEG): container finished" podID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerID="e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f" exitCode=0
Mar 21 04:53:43 crc kubenswrapper[4775]: I0321 04:53:43.997728 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerDied","Data":"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f"}
Mar 21 04:53:43 crc kubenswrapper[4775]: I0321 04:53:43.999221 4775 scope.go:117] "RemoveContainer" containerID="e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f"
Mar 21 04:53:45 crc kubenswrapper[4775]: I0321 04:53:45.013139 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerStarted","Data":"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2"}
Mar 21 04:53:45 crc kubenswrapper[4775]: I0321 04:53:45.013779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4"
Mar 21 04:53:45 crc kubenswrapper[4775]: I0321 04:53:45.017792 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4"
Mar 21 04:53:54 crc kubenswrapper[4775]: I0321 04:53:54.949272 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"]
Mar 21 04:53:54 crc kubenswrapper[4775]: I0321 04:53:54.949973 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerName="controller-manager" containerID="cri-o://20c1b7542354b374133f523fc8f60ffcbd5359eb5d860eb5669e6229862d91e5" gracePeriod=30
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.029870 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"]
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.030276 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" podUID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" containerName="route-controller-manager" containerID="cri-o://2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8" gracePeriod=30
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.070399 4775 generic.go:334] "Generic (PLEG): container finished" podID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerID="20c1b7542354b374133f523fc8f60ffcbd5359eb5d860eb5669e6229862d91e5" exitCode=0
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.070437 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" event={"ID":"28d4e02d-a2fa-4465-936b-df5f7ef22086","Type":"ContainerDied","Data":"20c1b7542354b374133f523fc8f60ffcbd5359eb5d860eb5669e6229862d91e5"}
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.478187 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.537186 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.555456 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca\") pod \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.555501 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert\") pod \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.555538 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxv79\" (UniqueName: \"kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79\") pod \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.555570 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config\") pod \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\" (UID: \"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.556333 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" (UID: "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.556345 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config" (OuterVolumeSpecName: "config") pod "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" (UID: "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.556558 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.556576 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.563006 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79" (OuterVolumeSpecName: "kube-api-access-mxv79") pod "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" (UID: "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb"). InnerVolumeSpecName "kube-api-access-mxv79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.563349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" (UID: "e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config\") pod \"28d4e02d-a2fa-4465-936b-df5f7ef22086\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles\") pod \"28d4e02d-a2fa-4465-936b-df5f7ef22086\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657159 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca\") pod \"28d4e02d-a2fa-4465-936b-df5f7ef22086\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert\") pod \"28d4e02d-a2fa-4465-936b-df5f7ef22086\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67jrx\" (UniqueName: \"kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx\") pod \"28d4e02d-a2fa-4465-936b-df5f7ef22086\" (UID: \"28d4e02d-a2fa-4465-936b-df5f7ef22086\") "
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657540 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657557 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxv79\" (UniqueName: \"kubernetes.io/projected/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb-kube-api-access-mxv79\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657875 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28d4e02d-a2fa-4465-936b-df5f7ef22086" (UID: "28d4e02d-a2fa-4465-936b-df5f7ef22086"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.657958 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config" (OuterVolumeSpecName: "config") pod "28d4e02d-a2fa-4465-936b-df5f7ef22086" (UID: "28d4e02d-a2fa-4465-936b-df5f7ef22086"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.658513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca" (OuterVolumeSpecName: "client-ca") pod "28d4e02d-a2fa-4465-936b-df5f7ef22086" (UID: "28d4e02d-a2fa-4465-936b-df5f7ef22086"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.660065 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28d4e02d-a2fa-4465-936b-df5f7ef22086" (UID: "28d4e02d-a2fa-4465-936b-df5f7ef22086"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.660262 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx" (OuterVolumeSpecName: "kube-api-access-67jrx") pod "28d4e02d-a2fa-4465-936b-df5f7ef22086" (UID: "28d4e02d-a2fa-4465-936b-df5f7ef22086"). InnerVolumeSpecName "kube-api-access-67jrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.759476 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d4e02d-a2fa-4465-936b-df5f7ef22086-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.759529 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67jrx\" (UniqueName: \"kubernetes.io/projected/28d4e02d-a2fa-4465-936b-df5f7ef22086-kube-api-access-67jrx\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.759550 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.759567 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName:
\"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:55 crc kubenswrapper[4775]: I0321 04:53:55.759586 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28d4e02d-a2fa-4465-936b-df5f7ef22086-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.033519 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"] Mar 21 04:53:56 crc kubenswrapper[4775]: E0321 04:53:56.033705 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" containerName="route-controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.033717 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" containerName="route-controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: E0321 04:53:56.033732 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerName="controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.033740 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerName="controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.033834 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" containerName="route-controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.033846 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" containerName="controller-manager" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.034202 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.043314 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"] Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.079975 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.079961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-snbl7" event={"ID":"28d4e02d-a2fa-4465-936b-df5f7ef22086","Type":"ContainerDied","Data":"87509b0780118b1a8a1bce50fbeee0c0687a23effaec2af6b41e83b608d4b8d9"} Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.080196 4775 scope.go:117] "RemoveContainer" containerID="20c1b7542354b374133f523fc8f60ffcbd5359eb5d860eb5669e6229862d91e5" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.081591 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" containerID="2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8" exitCode=0 Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.081640 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.081634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" event={"ID":"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb","Type":"ContainerDied","Data":"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8"} Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.081766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq" event={"ID":"e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb","Type":"ContainerDied","Data":"56c232d6ac9969429c18b124a338e1b6499df231f55af06b4690d442d2b765b2"} Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.101186 4775 scope.go:117] "RemoveContainer" containerID="2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.116023 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"] Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.117857 4775 scope.go:117] "RemoveContainer" containerID="2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8" Mar 21 04:53:56 crc kubenswrapper[4775]: E0321 04:53:56.118388 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8\": container with ID starting with 2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8 not found: ID does not exist" containerID="2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.118440 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8"} err="failed to get container status \"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8\": rpc error: code = NotFound desc = could not find container \"2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8\": container with ID starting with 2115ff11bda799954a12857b7ea38cd8c929ef48a09eaaeb6e02765150d49cb8 not found: ID does not exist" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.120344 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-snbl7"] Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.130225 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"] Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.133303 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-jh8tq"] Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.164364 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.164422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.164448 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.164486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48s5\" (UniqueName: \"kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.164520 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.265520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.265695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: 
\"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.265752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.265796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48s5\" (UniqueName: \"kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.265860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.266825 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.267451 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.267540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.271130 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.282681 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48s5\" (UniqueName: \"kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5\") pod \"controller-manager-77fc8489b9-p4xpt\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") " pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.347434 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:56 crc kubenswrapper[4775]: I0321 04:53:56.746617 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"] Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.034746 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"] Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.035481 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.036824 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.037186 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.037551 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.037856 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.038153 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.041421 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.045512 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"] Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.088695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" event={"ID":"41534621-ccd0-4e69-a13c-e6828feebe17","Type":"ContainerStarted","Data":"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"} Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.088763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" event={"ID":"41534621-ccd0-4e69-a13c-e6828feebe17","Type":"ContainerStarted","Data":"69122f1ec9bc520b92bd94027f2fae1d2dd0387607089568975168c52c43b3a7"} Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.089202 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.103788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.112629 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" podStartSLOduration=3.112613185 podStartE2EDuration="3.112613185s" podCreationTimestamp="2026-03-21 04:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:57.109787113 +0000 UTC m=+390.086250757" watchObservedRunningTime="2026-03-21 04:53:57.112613185 +0000 UTC m=+390.089076809" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.180911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dwd\" (UniqueName: 
\"kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.181279 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.181368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.181388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.282190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.282247 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.282293 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dwd\" (UniqueName: \"kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.282332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.283447 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.283541 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.287352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.298316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dwd\" (UniqueName: \"kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd\") pod \"route-controller-manager-7d78b4c7b9-244bv\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") " pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.356687 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.669374 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d4e02d-a2fa-4465-936b-df5f7ef22086" path="/var/lib/kubelet/pods/28d4e02d-a2fa-4465-936b-df5f7ef22086/volumes" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.670154 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb" path="/var/lib/kubelet/pods/e3e7cd58-51b5-4a7f-b374-fe94b9b0c1eb/volumes" Mar 21 04:53:57 crc kubenswrapper[4775]: I0321 04:53:57.793948 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"] Mar 21 04:53:57 crc kubenswrapper[4775]: W0321 04:53:57.800337 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod951da48c_671e_4004_85f0_2137949a3a2d.slice/crio-aa646cd6ba5a44de57d221735dcb1de71958eda8b620088cc7ef6124aed0413e WatchSource:0}: Error finding container aa646cd6ba5a44de57d221735dcb1de71958eda8b620088cc7ef6124aed0413e: Status 404 returned error can't find the container with id aa646cd6ba5a44de57d221735dcb1de71958eda8b620088cc7ef6124aed0413e Mar 21 04:53:58 crc kubenswrapper[4775]: I0321 04:53:58.099316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" event={"ID":"951da48c-671e-4004-85f0-2137949a3a2d","Type":"ContainerStarted","Data":"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"} Mar 21 04:53:58 crc kubenswrapper[4775]: I0321 04:53:58.099383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" 
event={"ID":"951da48c-671e-4004-85f0-2137949a3a2d","Type":"ContainerStarted","Data":"aa646cd6ba5a44de57d221735dcb1de71958eda8b620088cc7ef6124aed0413e"} Mar 21 04:53:58 crc kubenswrapper[4775]: I0321 04:53:58.099547 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:58 crc kubenswrapper[4775]: I0321 04:53:58.335779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" Mar 21 04:53:58 crc kubenswrapper[4775]: I0321 04:53:58.355958 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" podStartSLOduration=3.355935454 podStartE2EDuration="3.355935454s" podCreationTimestamp="2026-03-21 04:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:53:58.116677296 +0000 UTC m=+391.093140920" watchObservedRunningTime="2026-03-21 04:53:58.355935454 +0000 UTC m=+391.332399078" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.158587 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567814-fvs8k"] Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.160001 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.222325 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.223956 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.224505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.234168 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-fvs8k"] Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.323080 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfgpl\" (UniqueName: \"kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl\") pod \"auto-csr-approver-29567814-fvs8k\" (UID: \"e8f9e3cb-7583-4349-98dc-422a7880fe5e\") " pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.423861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfgpl\" (UniqueName: \"kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl\") pod \"auto-csr-approver-29567814-fvs8k\" (UID: \"e8f9e3cb-7583-4349-98dc-422a7880fe5e\") " pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.443669 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfgpl\" (UniqueName: \"kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl\") pod \"auto-csr-approver-29567814-fvs8k\" (UID: \"e8f9e3cb-7583-4349-98dc-422a7880fe5e\") " 
pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:00 crc kubenswrapper[4775]: I0321 04:54:00.573000 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:01 crc kubenswrapper[4775]: I0321 04:54:01.004888 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-fvs8k"] Mar 21 04:54:01 crc kubenswrapper[4775]: I0321 04:54:01.118787 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" event={"ID":"e8f9e3cb-7583-4349-98dc-422a7880fe5e","Type":"ContainerStarted","Data":"bef13ceeba8d06c136483de90c1fc8ff328392147f003ed93dfc860c9a4863ee"} Mar 21 04:54:03 crc kubenswrapper[4775]: I0321 04:54:03.131852 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8f9e3cb-7583-4349-98dc-422a7880fe5e" containerID="55e31feec3221e168e396a1b14aac3afcd926bdfa911a175f057a586cb60827a" exitCode=0 Mar 21 04:54:03 crc kubenswrapper[4775]: I0321 04:54:03.131903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" event={"ID":"e8f9e3cb-7583-4349-98dc-422a7880fe5e","Type":"ContainerDied","Data":"55e31feec3221e168e396a1b14aac3afcd926bdfa911a175f057a586cb60827a"} Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.514076 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.530654 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svf9w"] Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.530870 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-svf9w" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="registry-server" containerID="cri-o://924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a" gracePeriod=2 Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.687309 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfgpl\" (UniqueName: \"kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl\") pod \"e8f9e3cb-7583-4349-98dc-422a7880fe5e\" (UID: \"e8f9e3cb-7583-4349-98dc-422a7880fe5e\") " Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.696309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl" (OuterVolumeSpecName: "kube-api-access-gfgpl") pod "e8f9e3cb-7583-4349-98dc-422a7880fe5e" (UID: "e8f9e3cb-7583-4349-98dc-422a7880fe5e"). InnerVolumeSpecName "kube-api-access-gfgpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.789679 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfgpl\" (UniqueName: \"kubernetes.io/projected/e8f9e3cb-7583-4349-98dc-422a7880fe5e-kube-api-access-gfgpl\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:04 crc kubenswrapper[4775]: I0321 04:54:04.963536 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.092914 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities\") pod \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.093001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content\") pod \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.093095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wq48\" (UniqueName: \"kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48\") pod \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\" (UID: \"1dcbca72-150a-47c6-ac3c-f701ae82e05b\") " Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.093990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities" (OuterVolumeSpecName: "utilities") pod "1dcbca72-150a-47c6-ac3c-f701ae82e05b" (UID: "1dcbca72-150a-47c6-ac3c-f701ae82e05b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.096748 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48" (OuterVolumeSpecName: "kube-api-access-4wq48") pod "1dcbca72-150a-47c6-ac3c-f701ae82e05b" (UID: "1dcbca72-150a-47c6-ac3c-f701ae82e05b"). InnerVolumeSpecName "kube-api-access-4wq48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.143655 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dcbca72-150a-47c6-ac3c-f701ae82e05b" (UID: "1dcbca72-150a-47c6-ac3c-f701ae82e05b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.145111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" event={"ID":"e8f9e3cb-7583-4349-98dc-422a7880fe5e","Type":"ContainerDied","Data":"bef13ceeba8d06c136483de90c1fc8ff328392147f003ed93dfc860c9a4863ee"} Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.145190 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-fvs8k" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.145209 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bef13ceeba8d06c136483de90c1fc8ff328392147f003ed93dfc860c9a4863ee" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.148084 4775 generic.go:334] "Generic (PLEG): container finished" podID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerID="924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a" exitCode=0 Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.148272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerDied","Data":"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a"} Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.148313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-svf9w" 
event={"ID":"1dcbca72-150a-47c6-ac3c-f701ae82e05b","Type":"ContainerDied","Data":"e49106633d57c504f2a2b545d76a34559e4299f9b5c7c912c88ab2862f4e055a"} Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.148332 4775 scope.go:117] "RemoveContainer" containerID="924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.148440 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-svf9w" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.163868 4775 scope.go:117] "RemoveContainer" containerID="a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.185350 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-svf9w"] Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.187191 4775 scope.go:117] "RemoveContainer" containerID="00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.193955 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wq48\" (UniqueName: \"kubernetes.io/projected/1dcbca72-150a-47c6-ac3c-f701ae82e05b-kube-api-access-4wq48\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.194021 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.194033 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcbca72-150a-47c6-ac3c-f701ae82e05b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.198083 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-svf9w"] Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.201732 4775 scope.go:117] "RemoveContainer" containerID="924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a" Mar 21 04:54:05 crc kubenswrapper[4775]: E0321 04:54:05.204795 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a\": container with ID starting with 924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a not found: ID does not exist" containerID="924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.204865 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a"} err="failed to get container status \"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a\": rpc error: code = NotFound desc = could not find container \"924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a\": container with ID starting with 924b8554190993f0392bd8c20440e982df121b27ea4559dd16429c3faaecbb7a not found: ID does not exist" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.204910 4775 scope.go:117] "RemoveContainer" containerID="a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1" Mar 21 04:54:05 crc kubenswrapper[4775]: E0321 04:54:05.205279 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1\": container with ID starting with a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1 not found: ID does not exist" containerID="a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 
04:54:05.205311 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1"} err="failed to get container status \"a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1\": rpc error: code = NotFound desc = could not find container \"a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1\": container with ID starting with a41e12fdcec460e076316274065e17d2ca1fc9dae7706fccd22acc387dfd57b1 not found: ID does not exist" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.205336 4775 scope.go:117] "RemoveContainer" containerID="00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc" Mar 21 04:54:05 crc kubenswrapper[4775]: E0321 04:54:05.205664 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc\": container with ID starting with 00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc not found: ID does not exist" containerID="00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.205698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc"} err="failed to get container status \"00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc\": rpc error: code = NotFound desc = could not find container \"00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc\": container with ID starting with 00e2e82d9b942bac6084d281d0485867290448469dd14201683721f46ea863fc not found: ID does not exist" Mar 21 04:54:05 crc kubenswrapper[4775]: I0321 04:54:05.667182 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" 
path="/var/lib/kubelet/pods/1dcbca72-150a-47c6-ac3c-f701ae82e05b/volumes" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.731368 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"] Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.731698 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7k9df" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="registry-server" containerID="cri-o://e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402" gracePeriod=2 Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881398 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4lpkr"] Mar 21 04:54:06 crc kubenswrapper[4775]: E0321 04:54:06.881662 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="extract-utilities" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881679 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="extract-utilities" Mar 21 04:54:06 crc kubenswrapper[4775]: E0321 04:54:06.881695 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f9e3cb-7583-4349-98dc-422a7880fe5e" containerName="oc" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881702 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f9e3cb-7583-4349-98dc-422a7880fe5e" containerName="oc" Mar 21 04:54:06 crc kubenswrapper[4775]: E0321 04:54:06.881712 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="registry-server" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881719 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="registry-server" Mar 21 04:54:06 crc 
kubenswrapper[4775]: E0321 04:54:06.881732 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="extract-content" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881740 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="extract-content" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881860 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f9e3cb-7583-4349-98dc-422a7880fe5e" containerName="oc" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.881876 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcbca72-150a-47c6-ac3c-f701ae82e05b" containerName="registry-server" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.886047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:06 crc kubenswrapper[4775]: I0321 04:54:06.904848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4lpkr"] Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-tls\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/946ed25f-3a5b-446c-a928-2a85e719ff89-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-certificates\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-trusted-ca\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-bound-sa-token\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015837 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbvt\" 
(UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-kube-api-access-7pbvt\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.015867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/946ed25f-3a5b-446c-a928-2a85e719ff89-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.039926 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116455 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-certificates\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116512 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-trusted-ca\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc 
kubenswrapper[4775]: I0321 04:54:07.116540 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-bound-sa-token\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116591 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbvt\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-kube-api-access-7pbvt\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/946ed25f-3a5b-446c-a928-2a85e719ff89-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-tls\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.116694 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/946ed25f-3a5b-446c-a928-2a85e719ff89-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: 
\"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.117004 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/946ed25f-3a5b-446c-a928-2a85e719ff89-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.117697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-certificates\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.118672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/946ed25f-3a5b-446c-a928-2a85e719ff89-trusted-ca\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.124951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-registry-tls\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.129325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/946ed25f-3a5b-446c-a928-2a85e719ff89-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.132951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-bound-sa-token\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.133374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbvt\" (UniqueName: \"kubernetes.io/projected/946ed25f-3a5b-446c-a928-2a85e719ff89-kube-api-access-7pbvt\") pod \"image-registry-66df7c8f76-4lpkr\" (UID: \"946ed25f-3a5b-446c-a928-2a85e719ff89\") " pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: E0321 04:54:07.165029 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402 is running failed: container process not found" containerID="e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402" cmd=["grpc_health_probe","-addr=:50051"] Mar 21 04:54:07 crc kubenswrapper[4775]: E0321 04:54:07.165548 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402 is running failed: container process not found" containerID="e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402" cmd=["grpc_health_probe","-addr=:50051"] Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.165679 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerID="e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402" exitCode=0 Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.165705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerDied","Data":"e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402"} Mar 21 04:54:07 crc kubenswrapper[4775]: E0321 04:54:07.165860 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402 is running failed: container process not found" containerID="e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402" cmd=["grpc_health_probe","-addr=:50051"] Mar 21 04:54:07 crc kubenswrapper[4775]: E0321 04:54:07.165892 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-7k9df" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="registry-server" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.192907 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7k9df" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.203541 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.318508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities\") pod \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.319012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm7zf\" (UniqueName: \"kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf\") pod \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.319134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content\") pod \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\" (UID: \"55e2733e-3620-4ecb-a51e-33b1fd3dce9d\") " Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.320291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities" (OuterVolumeSpecName: "utilities") pod "55e2733e-3620-4ecb-a51e-33b1fd3dce9d" (UID: "55e2733e-3620-4ecb-a51e-33b1fd3dce9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.322382 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf" (OuterVolumeSpecName: "kube-api-access-mm7zf") pod "55e2733e-3620-4ecb-a51e-33b1fd3dce9d" (UID: "55e2733e-3620-4ecb-a51e-33b1fd3dce9d"). InnerVolumeSpecName "kube-api-access-mm7zf". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.324259 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.324307 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm7zf\" (UniqueName: \"kubernetes.io/projected/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-kube-api-access-mm7zf\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.361526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55e2733e-3620-4ecb-a51e-33b1fd3dce9d" (UID: "55e2733e-3620-4ecb-a51e-33b1fd3dce9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.425405 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55e2733e-3620-4ecb-a51e-33b1fd3dce9d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:07 crc kubenswrapper[4775]: I0321 04:54:07.610749 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4lpkr"]
Mar 21 04:54:07 crc kubenswrapper[4775]: W0321 04:54:07.615049 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod946ed25f_3a5b_446c_a928_2a85e719ff89.slice/crio-fa8267be73a89d641d14c54a74088f2114a7394e2ab393f72da796156c111a79 WatchSource:0}: Error finding container fa8267be73a89d641d14c54a74088f2114a7394e2ab393f72da796156c111a79: Status 404 returned error can't find the container with id fa8267be73a89d641d14c54a74088f2114a7394e2ab393f72da796156c111a79
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.171428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" event={"ID":"946ed25f-3a5b-446c-a928-2a85e719ff89","Type":"ContainerStarted","Data":"89bc38e86d0a705f93814e892412fcdeffe60710cbc9847a1c18608f1666df9d"}
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.171796 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" event={"ID":"946ed25f-3a5b-446c-a928-2a85e719ff89","Type":"ContainerStarted","Data":"fa8267be73a89d641d14c54a74088f2114a7394e2ab393f72da796156c111a79"}
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.171842 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr"
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.173641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7k9df" event={"ID":"55e2733e-3620-4ecb-a51e-33b1fd3dce9d","Type":"ContainerDied","Data":"5477663a3a0fc934d7191139776ed4c58017ab46346a0a37602d99108fd8b86f"}
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.173692 4775 scope.go:117] "RemoveContainer" containerID="e5e47a7de8c354ec132dd0116fce768c5a764f807e5342721397e5aed3144402"
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.173886 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7k9df"
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.189078 4775 scope.go:117] "RemoveContainer" containerID="93e802bba3a0a60860befd9b343fb138a5913ba96b3cd2219f5f37eec14db557"
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.195643 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" podStartSLOduration=2.195622179 podStartE2EDuration="2.195622179s" podCreationTimestamp="2026-03-21 04:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:54:08.190647187 +0000 UTC m=+401.167110821" watchObservedRunningTime="2026-03-21 04:54:08.195622179 +0000 UTC m=+401.172085803"
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.202263 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"]
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.206426 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7k9df"]
Mar 21 04:54:08 crc kubenswrapper[4775]: I0321 04:54:08.209428 4775 scope.go:117] "RemoveContainer" containerID="2fca8a116d2bdda5fb8662833d28064b756cf76d00f4ea921c85b01d826cd2be"
Mar 21 04:54:09 crc kubenswrapper[4775]: I0321 04:54:09.668798 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" path="/var/lib/kubelet/pods/55e2733e-3620-4ecb-a51e-33b1fd3dce9d/volumes"
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.392275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"]
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.393081 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" podUID="41534621-ccd0-4e69-a13c-e6828feebe17" containerName="controller-manager" containerID="cri-o://5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94" gracePeriod=30
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.406771 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"]
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.407033 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" podUID="951da48c-671e-4004-85f0-2137949a3a2d" containerName="route-controller-manager" containerID="cri-o://3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a" gracePeriod=30
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.824536 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.919515 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dwd\" (UniqueName: \"kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd\") pod \"951da48c-671e-4004-85f0-2137949a3a2d\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") "
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.919564 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config\") pod \"951da48c-671e-4004-85f0-2137949a3a2d\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") "
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.919623 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert\") pod \"951da48c-671e-4004-85f0-2137949a3a2d\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") "
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.919676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca\") pod \"951da48c-671e-4004-85f0-2137949a3a2d\" (UID: \"951da48c-671e-4004-85f0-2137949a3a2d\") "
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.920610 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config" (OuterVolumeSpecName: "config") pod "951da48c-671e-4004-85f0-2137949a3a2d" (UID: "951da48c-671e-4004-85f0-2137949a3a2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.921577 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca" (OuterVolumeSpecName: "client-ca") pod "951da48c-671e-4004-85f0-2137949a3a2d" (UID: "951da48c-671e-4004-85f0-2137949a3a2d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.925359 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "951da48c-671e-4004-85f0-2137949a3a2d" (UID: "951da48c-671e-4004-85f0-2137949a3a2d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:14 crc kubenswrapper[4775]: I0321 04:54:14.925559 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd" (OuterVolumeSpecName: "kube-api-access-d4dwd") pod "951da48c-671e-4004-85f0-2137949a3a2d" (UID: "951da48c-671e-4004-85f0-2137949a3a2d"). InnerVolumeSpecName "kube-api-access-d4dwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.009818 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.021040 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dwd\" (UniqueName: \"kubernetes.io/projected/951da48c-671e-4004-85f0-2137949a3a2d-kube-api-access-d4dwd\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.021096 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.021106 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/951da48c-671e-4004-85f0-2137949a3a2d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.021129 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/951da48c-671e-4004-85f0-2137949a3a2d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.121674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config\") pod \"41534621-ccd0-4e69-a13c-e6828feebe17\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") "
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.121750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert\") pod \"41534621-ccd0-4e69-a13c-e6828feebe17\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") "
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.121834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles\") pod \"41534621-ccd0-4e69-a13c-e6828feebe17\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") "
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.121907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48s5\" (UniqueName: \"kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5\") pod \"41534621-ccd0-4e69-a13c-e6828feebe17\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") "
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.122045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca\") pod \"41534621-ccd0-4e69-a13c-e6828feebe17\" (UID: \"41534621-ccd0-4e69-a13c-e6828feebe17\") "
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.122689 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config" (OuterVolumeSpecName: "config") pod "41534621-ccd0-4e69-a13c-e6828feebe17" (UID: "41534621-ccd0-4e69-a13c-e6828feebe17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.122710 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca" (OuterVolumeSpecName: "client-ca") pod "41534621-ccd0-4e69-a13c-e6828feebe17" (UID: "41534621-ccd0-4e69-a13c-e6828feebe17"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.123425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "41534621-ccd0-4e69-a13c-e6828feebe17" (UID: "41534621-ccd0-4e69-a13c-e6828feebe17"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.125714 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "41534621-ccd0-4e69-a13c-e6828feebe17" (UID: "41534621-ccd0-4e69-a13c-e6828feebe17"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.126862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5" (OuterVolumeSpecName: "kube-api-access-s48s5") pod "41534621-ccd0-4e69-a13c-e6828feebe17" (UID: "41534621-ccd0-4e69-a13c-e6828feebe17"). InnerVolumeSpecName "kube-api-access-s48s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.208442 4775 generic.go:334] "Generic (PLEG): container finished" podID="951da48c-671e-4004-85f0-2137949a3a2d" containerID="3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a" exitCode=0
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.208487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" event={"ID":"951da48c-671e-4004-85f0-2137949a3a2d","Type":"ContainerDied","Data":"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"}
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.208503 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.208537 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv" event={"ID":"951da48c-671e-4004-85f0-2137949a3a2d","Type":"ContainerDied","Data":"aa646cd6ba5a44de57d221735dcb1de71958eda8b620088cc7ef6124aed0413e"}
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.208562 4775 scope.go:117] "RemoveContainer" containerID="3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.210440 4775 generic.go:334] "Generic (PLEG): container finished" podID="41534621-ccd0-4e69-a13c-e6828feebe17" containerID="5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94" exitCode=0
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.210474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" event={"ID":"41534621-ccd0-4e69-a13c-e6828feebe17","Type":"ContainerDied","Data":"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"}
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.210490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt" event={"ID":"41534621-ccd0-4e69-a13c-e6828feebe17","Type":"ContainerDied","Data":"69122f1ec9bc520b92bd94027f2fae1d2dd0387607089568975168c52c43b3a7"}
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.210541 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.223912 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s48s5\" (UniqueName: \"kubernetes.io/projected/41534621-ccd0-4e69-a13c-e6828feebe17-kube-api-access-s48s5\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.223950 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.223962 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.223972 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41534621-ccd0-4e69-a13c-e6828feebe17-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.223983 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41534621-ccd0-4e69-a13c-e6828feebe17-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.225546 4775 scope.go:117] "RemoveContainer" containerID="3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"
Mar 21 04:54:15 crc kubenswrapper[4775]: E0321 04:54:15.226080 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a\": container with ID starting with 3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a not found: ID does not exist" containerID="3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.226244 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a"} err="failed to get container status \"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a\": rpc error: code = NotFound desc = could not find container \"3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a\": container with ID starting with 3980fc5ac2c84abc585c64be5fe0e20cdc2ee02408d0873f172d592ea567b97a not found: ID does not exist"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.226305 4775 scope.go:117] "RemoveContainer" containerID="5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.255549 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"]
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.258916 4775 scope.go:117] "RemoveContainer" containerID="5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"
Mar 21 04:54:15 crc kubenswrapper[4775]: E0321 04:54:15.259798 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94\": container with ID starting with 5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94 not found: ID does not exist" containerID="5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.259833 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94"} err="failed to get container status \"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94\": rpc error: code = NotFound desc = could not find container \"5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94\": container with ID starting with 5104548b211c2e16bc96a01a5e3e4f9c0385383b17796ee059444b9f78c92a94 not found: ID does not exist"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.266321 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77fc8489b9-p4xpt"]
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.271833 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"]
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.276288 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d78b4c7b9-244bv"]
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.668835 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41534621-ccd0-4e69-a13c-e6828feebe17" path="/var/lib/kubelet/pods/41534621-ccd0-4e69-a13c-e6828feebe17/volumes"
Mar 21 04:54:15 crc kubenswrapper[4775]: I0321 04:54:15.669360 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="951da48c-671e-4004-85f0-2137949a3a2d" path="/var/lib/kubelet/pods/951da48c-671e-4004-85f0-2137949a3a2d/volumes"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.047800 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"]
Mar 21 04:54:16 crc kubenswrapper[4775]: E0321 04:54:16.048081 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="extract-content"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048146 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="extract-content"
Mar 21 04:54:16 crc kubenswrapper[4775]: E0321 04:54:16.048158 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41534621-ccd0-4e69-a13c-e6828feebe17" containerName="controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048165 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="41534621-ccd0-4e69-a13c-e6828feebe17" containerName="controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: E0321 04:54:16.048177 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="extract-utilities"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048184 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="extract-utilities"
Mar 21 04:54:16 crc kubenswrapper[4775]: E0321 04:54:16.048198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="951da48c-671e-4004-85f0-2137949a3a2d" containerName="route-controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048204 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="951da48c-671e-4004-85f0-2137949a3a2d" containerName="route-controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: E0321 04:54:16.048219 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="registry-server"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048226 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="registry-server"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048346 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e2733e-3620-4ecb-a51e-33b1fd3dce9d" containerName="registry-server"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048361 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41534621-ccd0-4e69-a13c-e6828feebe17" containerName="controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048370 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="951da48c-671e-4004-85f0-2137949a3a2d" containerName="route-controller-manager"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.048840 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.051298 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"]
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.051894 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.057565 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073137 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073184 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073210 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073342 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073461 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.073922 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.074057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.074199 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.074238 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.074460 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.077195 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.077895 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"]
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.085378 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.091715 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"]
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-serving-cert\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-client-ca\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngsr\" (UniqueName: \"kubernetes.io/projected/8b446038-617c-4df4-8bbe-5d821cd3dc27-kube-api-access-6ngsr\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135531 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-config\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135596 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-client-ca\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-config\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khn9s\" (UniqueName: \"kubernetes.io/projected/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-kube-api-access-khn9s\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.135665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b446038-617c-4df4-8bbe-5d821cd3dc27-serving-cert\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236357 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-config\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236411 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-client-ca\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236442 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-config\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khn9s\" (UniqueName: \"kubernetes.io/projected/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-kube-api-access-khn9s\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236484 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b446038-617c-4df4-8bbe-5d821cd3dc27-serving-cert\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236502 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-serving-cert\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236524 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-client-ca\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236549 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngsr\" (UniqueName: \"kubernetes.io/projected/8b446038-617c-4df4-8bbe-5d821cd3dc27-kube-api-access-6ngsr\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.236582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.237424 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-client-ca\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.237658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-config\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.237777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-proxy-ca-bundles\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.237792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-config\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.238265 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b446038-617c-4df4-8bbe-5d821cd3dc27-client-ca\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.240783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-serving-cert\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.242105 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b446038-617c-4df4-8bbe-5d821cd3dc27-serving-cert\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"
Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.256433 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khn9s\" (UniqueName:
\"kubernetes.io/projected/e0aff99b-e78c-42b5-ac13-8fb27b6c88e8-kube-api-access-khn9s\") pod \"controller-manager-5b5b8879f4-p7rtv\" (UID: \"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8\") " pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.261836 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngsr\" (UniqueName: \"kubernetes.io/projected/8b446038-617c-4df4-8bbe-5d821cd3dc27-kube-api-access-6ngsr\") pod \"route-controller-manager-555c6b4d6b-pkcfh\" (UID: \"8b446038-617c-4df4-8bbe-5d821cd3dc27\") " pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.365640 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.377078 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.693838 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh"] Mar 21 04:54:16 crc kubenswrapper[4775]: I0321 04:54:16.833442 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv"] Mar 21 04:54:16 crc kubenswrapper[4775]: W0321 04:54:16.851676 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0aff99b_e78c_42b5_ac13_8fb27b6c88e8.slice/crio-1fe730cb1afcc8a1047dbfae63ddc7b521943634d7feffb06f13e3de2a8a64cb WatchSource:0}: Error finding container 1fe730cb1afcc8a1047dbfae63ddc7b521943634d7feffb06f13e3de2a8a64cb: Status 404 returned error can't find the container with id 1fe730cb1afcc8a1047dbfae63ddc7b521943634d7feffb06f13e3de2a8a64cb Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.223924 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" event={"ID":"8b446038-617c-4df4-8bbe-5d821cd3dc27","Type":"ContainerStarted","Data":"4aaf40b75759912b5aef27b45aeb115f7739c6aab50515e03ff4a232d96bd5c7"} Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.223977 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" event={"ID":"8b446038-617c-4df4-8bbe-5d821cd3dc27","Type":"ContainerStarted","Data":"3a7d4dcc62dee004c6591691be664dbc82a6d433863cc1859414b3800c88b76f"} Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.225136 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 
04:54:17.226606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" event={"ID":"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8","Type":"ContainerStarted","Data":"1c522f7b9a2336087d2eee63f9397f360e81837d6c247583783be9f284c41a0f"} Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.226643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" event={"ID":"e0aff99b-e78c-42b5-ac13-8fb27b6c88e8","Type":"ContainerStarted","Data":"1fe730cb1afcc8a1047dbfae63ddc7b521943634d7feffb06f13e3de2a8a64cb"} Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.226839 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.231518 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.250411 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" podStartSLOduration=3.250392365 podStartE2EDuration="3.250392365s" podCreationTimestamp="2026-03-21 04:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:54:17.246193404 +0000 UTC m=+410.222657038" watchObservedRunningTime="2026-03-21 04:54:17.250392365 +0000 UTC m=+410.226855979" Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.271810 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b5b8879f4-p7rtv" podStartSLOduration=3.2717912350000002 podStartE2EDuration="3.271791235s" podCreationTimestamp="2026-03-21 04:54:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:54:17.271190459 +0000 UTC m=+410.247654093" watchObservedRunningTime="2026-03-21 04:54:17.271791235 +0000 UTC m=+410.248254859" Mar 21 04:54:17 crc kubenswrapper[4775]: I0321 04:54:17.277012 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-555c6b4d6b-pkcfh" Mar 21 04:54:27 crc kubenswrapper[4775]: I0321 04:54:27.210826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4lpkr" Mar 21 04:54:27 crc kubenswrapper[4775]: I0321 04:54:27.271066 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.425578 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.430039 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-959kj" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="registry-server" containerID="cri-o://aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96" gracePeriod=30 Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.435172 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbtgv"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.443710 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wbtgv" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="registry-server" containerID="cri-o://02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a" gracePeriod=30 Mar 21 04:54:28 crc kubenswrapper[4775]: 
I0321 04:54:28.444212 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.444409 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" containerID="cri-o://53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2" gracePeriod=30 Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.456327 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.456609 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hldm7" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="registry-server" containerID="cri-o://bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b" gracePeriod=30 Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.470912 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7fcx"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.471814 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.483533 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.483893 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wjfwd" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="registry-server" containerID="cri-o://dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b" gracePeriod=30 Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.485011 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7fcx"] Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.632019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmbw8\" (UniqueName: \"kubernetes.io/projected/59fec450-4b61-4a15-b1b5-b47dedd649a0-kube-api-access-zmbw8\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.632095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.632248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.734082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmbw8\" (UniqueName: \"kubernetes.io/projected/59fec450-4b61-4a15-b1b5-b47dedd649a0-kube-api-access-zmbw8\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.734170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.734229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.735769 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 
crc kubenswrapper[4775]: I0321 04:54:28.741771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/59fec450-4b61-4a15-b1b5-b47dedd649a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.750919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmbw8\" (UniqueName: \"kubernetes.io/projected/59fec450-4b61-4a15-b1b5-b47dedd649a0-kube-api-access-zmbw8\") pod \"marketplace-operator-79b997595-z7fcx\" (UID: \"59fec450-4b61-4a15-b1b5-b47dedd649a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.800391 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:28 crc kubenswrapper[4775]: I0321 04:54:28.987344 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.139498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content\") pod \"ccb910ab-dcef-4523-81df-c0fb5eb83429\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.139567 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities\") pod \"ccb910ab-dcef-4523-81df-c0fb5eb83429\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.139596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92fzw\" (UniqueName: \"kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw\") pod \"ccb910ab-dcef-4523-81df-c0fb5eb83429\" (UID: \"ccb910ab-dcef-4523-81df-c0fb5eb83429\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.140746 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities" (OuterVolumeSpecName: "utilities") pod "ccb910ab-dcef-4523-81df-c0fb5eb83429" (UID: "ccb910ab-dcef-4523-81df-c0fb5eb83429"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.144271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw" (OuterVolumeSpecName: "kube-api-access-92fzw") pod "ccb910ab-dcef-4523-81df-c0fb5eb83429" (UID: "ccb910ab-dcef-4523-81df-c0fb5eb83429"). InnerVolumeSpecName "kube-api-access-92fzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.176575 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.183583 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.185299 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.215133 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccb910ab-dcef-4523-81df-c0fb5eb83429" (UID: "ccb910ab-dcef-4523-81df-c0fb5eb83429"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.240552 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.240585 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccb910ab-dcef-4523-81df-c0fb5eb83429-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.240603 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92fzw\" (UniqueName: \"kubernetes.io/projected/ccb910ab-dcef-4523-81df-c0fb5eb83429-kube-api-access-92fzw\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.241434 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.313950 4775 generic.go:334] "Generic (PLEG): container finished" podID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerID="dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b" exitCode=0 Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.314024 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerDied","Data":"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.314051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjfwd" event={"ID":"70ad413a-5f81-4094-b2d8-9b89698c6e32","Type":"ContainerDied","Data":"f3ec71e4b8bce118032bd210cfae1bbf7c53e6c8a5aba3f4f045bc35e188b3c0"} Mar 21 04:54:29 crc 
kubenswrapper[4775]: I0321 04:54:29.314067 4775 scope.go:117] "RemoveContainer" containerID="dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.314218 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjfwd" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.318732 4775 generic.go:334] "Generic (PLEG): container finished" podID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerID="aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96" exitCode=0 Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.318806 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerDied","Data":"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.318849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-959kj" event={"ID":"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e","Type":"ContainerDied","Data":"68f05861e537811c2b45eab073d2762025cd79d47c769cec2072bbee0ee4a322"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.318929 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-959kj" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.322498 4775 generic.go:334] "Generic (PLEG): container finished" podID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerID="53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2" exitCode=0 Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.322585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerDied","Data":"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.322621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" event={"ID":"dfbaac71-f99c-4373-a469-f2e5dd0ee632","Type":"ContainerDied","Data":"73bfd1fd04d6bcc9e68b01d3bd224e4608501bca9349188673dd3d6681965588"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.322692 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wrzs4" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.326637 4775 generic.go:334] "Generic (PLEG): container finished" podID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerID="bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b" exitCode=0 Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.326702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerDied","Data":"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.326732 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hldm7" event={"ID":"571e84f2-a2bc-4f09-ac53-d4a4adafa80b","Type":"ContainerDied","Data":"615ed550bf3a5c04a2f6727d10336a359668c9d0754381b6a62de7c552e1b1b1"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.327009 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hldm7" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.329210 4775 generic.go:334] "Generic (PLEG): container finished" podID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerID="02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a" exitCode=0 Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.329248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerDied","Data":"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.329269 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbtgv" event={"ID":"ccb910ab-dcef-4523-81df-c0fb5eb83429","Type":"ContainerDied","Data":"c9000a4269c650286aea4fb24ec488fdb1ea75dd0febdc2786cb6046e9da1f69"} Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.329342 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbtgv" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.333435 4775 scope.go:117] "RemoveContainer" containerID="1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380334 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities\") pod \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380381 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpppf\" (UniqueName: \"kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf\") pod \"70ad413a-5f81-4094-b2d8-9b89698c6e32\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics\") pod \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380446 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m44cw\" (UniqueName: \"kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw\") pod \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content\") pod 
\"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content\") pod \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380521 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities\") pod \"70ad413a-5f81-4094-b2d8-9b89698c6e32\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380550 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities\") pod \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380567 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ww6h\" (UniqueName: \"kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h\") pod \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\" (UID: \"5bb65dee-cd5f-46b3-9e7d-36e5d182d19e\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content\") pod \"70ad413a-5f81-4094-b2d8-9b89698c6e32\" (UID: \"70ad413a-5f81-4094-b2d8-9b89698c6e32\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380621 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-2gdv6\" (UniqueName: \"kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6\") pod \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\" (UID: \"571e84f2-a2bc-4f09-ac53-d4a4adafa80b\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.380642 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca\") pod \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\" (UID: \"dfbaac71-f99c-4373-a469-f2e5dd0ee632\") " Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.381478 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "dfbaac71-f99c-4373-a469-f2e5dd0ee632" (UID: "dfbaac71-f99c-4373-a469-f2e5dd0ee632"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.382100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities" (OuterVolumeSpecName: "utilities") pod "70ad413a-5f81-4094-b2d8-9b89698c6e32" (UID: "70ad413a-5f81-4094-b2d8-9b89698c6e32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.382910 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities" (OuterVolumeSpecName: "utilities") pod "571e84f2-a2bc-4f09-ac53-d4a4adafa80b" (UID: "571e84f2-a2bc-4f09-ac53-d4a4adafa80b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.385344 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities" (OuterVolumeSpecName: "utilities") pod "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" (UID: "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.401073 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h" (OuterVolumeSpecName: "kube-api-access-7ww6h") pod "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" (UID: "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e"). InnerVolumeSpecName "kube-api-access-7ww6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.404968 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbtgv"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.410303 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbtgv"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.413012 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "dfbaac71-f99c-4373-a469-f2e5dd0ee632" (UID: "dfbaac71-f99c-4373-a469-f2e5dd0ee632"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.413096 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw" (OuterVolumeSpecName: "kube-api-access-m44cw") pod "dfbaac71-f99c-4373-a469-f2e5dd0ee632" (UID: "dfbaac71-f99c-4373-a469-f2e5dd0ee632"). InnerVolumeSpecName "kube-api-access-m44cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.414016 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf" (OuterVolumeSpecName: "kube-api-access-vpppf") pod "70ad413a-5f81-4094-b2d8-9b89698c6e32" (UID: "70ad413a-5f81-4094-b2d8-9b89698c6e32"). InnerVolumeSpecName "kube-api-access-vpppf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.414237 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6" (OuterVolumeSpecName: "kube-api-access-2gdv6") pod "571e84f2-a2bc-4f09-ac53-d4a4adafa80b" (UID: "571e84f2-a2bc-4f09-ac53-d4a4adafa80b"). InnerVolumeSpecName "kube-api-access-2gdv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.414476 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7fcx"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.417539 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "571e84f2-a2bc-4f09-ac53-d4a4adafa80b" (UID: "571e84f2-a2bc-4f09-ac53-d4a4adafa80b"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: W0321 04:54:29.418803 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fec450_4b61_4a15_b1b5_b47dedd649a0.slice/crio-14ec881f38f46e3ad501a36d40d0dbbc592306dd4c1d73a4372f538296655a4f WatchSource:0}: Error finding container 14ec881f38f46e3ad501a36d40d0dbbc592306dd4c1d73a4372f538296655a4f: Status 404 returned error can't find the container with id 14ec881f38f46e3ad501a36d40d0dbbc592306dd4c1d73a4372f538296655a4f Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.422152 4775 scope.go:117] "RemoveContainer" containerID="3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.439102 4775 scope.go:117] "RemoveContainer" containerID="dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.439598 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b\": container with ID starting with dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b not found: ID does not exist" containerID="dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.439668 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b"} err="failed to get container status \"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b\": rpc error: code = NotFound desc = could not find container \"dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b\": container with ID starting with 
dd9cc974af699ec57297810c7016806ecad201037353fbcf0dc303241f78fc0b not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.439703 4775 scope.go:117] "RemoveContainer" containerID="1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.440295 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0\": container with ID starting with 1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0 not found: ID does not exist" containerID="1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.440358 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0"} err="failed to get container status \"1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0\": rpc error: code = NotFound desc = could not find container \"1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0\": container with ID starting with 1098f33834501f7272dae8532213ab991de93513a5cd5e8028551095da4598b0 not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.440391 4775 scope.go:117] "RemoveContainer" containerID="3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.440689 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b\": container with ID starting with 3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b not found: ID does not exist" containerID="3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b" Mar 21 04:54:29 crc 
kubenswrapper[4775]: I0321 04:54:29.440715 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b"} err="failed to get container status \"3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b\": rpc error: code = NotFound desc = could not find container \"3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b\": container with ID starting with 3b507bd384c25213b4044dd19f72806386309a77448d646faeed8c941611578b not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.440734 4775 scope.go:117] "RemoveContainer" containerID="aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.446602 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" (UID: "5bb65dee-cd5f-46b3-9e7d-36e5d182d19e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.461400 4775 scope.go:117] "RemoveContainer" containerID="8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.481934 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.481973 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m44cw\" (UniqueName: \"kubernetes.io/projected/dfbaac71-f99c-4373-a469-f2e5dd0ee632-kube-api-access-m44cw\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.481988 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482057 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482085 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482099 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482130 4775 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7ww6h\" (UniqueName: \"kubernetes.io/projected/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-kube-api-access-7ww6h\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482148 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gdv6\" (UniqueName: \"kubernetes.io/projected/571e84f2-a2bc-4f09-ac53-d4a4adafa80b-kube-api-access-2gdv6\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482164 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfbaac71-f99c-4373-a469-f2e5dd0ee632-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482180 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.482195 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpppf\" (UniqueName: \"kubernetes.io/projected/70ad413a-5f81-4094-b2d8-9b89698c6e32-kube-api-access-vpppf\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.489165 4775 scope.go:117] "RemoveContainer" containerID="c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.501109 4775 scope.go:117] "RemoveContainer" containerID="aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.501576 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96\": container with ID starting with aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96 not found: ID does 
not exist" containerID="aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.501617 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96"} err="failed to get container status \"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96\": rpc error: code = NotFound desc = could not find container \"aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96\": container with ID starting with aaa815a5f0e220d5cd57bb1c4b415350dcf17e31c1b987c38d4e0714c4280f96 not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.501648 4775 scope.go:117] "RemoveContainer" containerID="8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.502046 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc\": container with ID starting with 8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc not found: ID does not exist" containerID="8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.502079 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc"} err="failed to get container status \"8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc\": rpc error: code = NotFound desc = could not find container \"8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc\": container with ID starting with 8b5a6baf6223e16aada12bd66d033b18e08ee43831676efdf6aa9934129688fc not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.502102 4775 
scope.go:117] "RemoveContainer" containerID="c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.502413 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca\": container with ID starting with c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca not found: ID does not exist" containerID="c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.502432 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca"} err="failed to get container status \"c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca\": rpc error: code = NotFound desc = could not find container \"c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca\": container with ID starting with c8e5e4f89cea3a5393cd7ded536ab2f11744f74388f7645dc484cfc3075f65ca not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.502489 4775 scope.go:117] "RemoveContainer" containerID="53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.519889 4775 scope.go:117] "RemoveContainer" containerID="e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.537357 4775 scope.go:117] "RemoveContainer" containerID="53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.538202 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2\": container with ID starting with 
53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2 not found: ID does not exist" containerID="53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.538313 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2"} err="failed to get container status \"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2\": rpc error: code = NotFound desc = could not find container \"53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2\": container with ID starting with 53e93eb8775f797ec5701d0054608e0cc06d124aa1940ad067f4c33b12a82fa2 not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.538350 4775 scope.go:117] "RemoveContainer" containerID="e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.538629 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f\": container with ID starting with e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f not found: ID does not exist" containerID="e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.538651 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f"} err="failed to get container status \"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f\": rpc error: code = NotFound desc = could not find container \"e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f\": container with ID starting with e6613d34aa689f2c33b46ca7fb2fff89fd26737df8f8eb26a52c2030909ae41f not found: ID does not 
exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.538665 4775 scope.go:117] "RemoveContainer" containerID="bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.545846 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70ad413a-5f81-4094-b2d8-9b89698c6e32" (UID: "70ad413a-5f81-4094-b2d8-9b89698c6e32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.553567 4775 scope.go:117] "RemoveContainer" containerID="a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.583822 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ad413a-5f81-4094-b2d8-9b89698c6e32-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.607435 4775 scope.go:117] "RemoveContainer" containerID="ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.637418 4775 scope.go:117] "RemoveContainer" containerID="bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.637960 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b\": container with ID starting with bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b not found: ID does not exist" containerID="bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638012 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b"} err="failed to get container status \"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b\": rpc error: code = NotFound desc = could not find container \"bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b\": container with ID starting with bce5cf8972de39d9de712cead6829664ca56252c15f4837eb08ebb6d9c3a039b not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638047 4775 scope.go:117] "RemoveContainer" containerID="a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.638397 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a\": container with ID starting with a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a not found: ID does not exist" containerID="a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638424 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a"} err="failed to get container status \"a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a\": rpc error: code = NotFound desc = could not find container \"a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a\": container with ID starting with a669beb6a121e654078eda21a56b0d2171a5e04b1968ec341bfe87375574ff9a not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638440 4775 scope.go:117] "RemoveContainer" containerID="ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.638762 4775 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be\": container with ID starting with ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be not found: ID does not exist" containerID="ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638799 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be"} err="failed to get container status \"ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be\": rpc error: code = NotFound desc = could not find container \"ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be\": container with ID starting with ee5a58cbfa0edd60aaa49923480f7f0b63c1139c1f29a46a06572d11e904c5be not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.638815 4775 scope.go:117] "RemoveContainer" containerID="02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.657093 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.661482 4775 scope.go:117] "RemoveContainer" containerID="21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.674509 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" path="/var/lib/kubelet/pods/ccb910ab-dcef-4523-81df-c0fb5eb83429/volumes" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.677702 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wjfwd"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.683636 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.688509 4775 scope.go:117] "RemoveContainer" containerID="9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.696975 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-959kj"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.701537 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.705102 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wrzs4"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.708837 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.709082 4775 scope.go:117] "RemoveContainer" containerID="02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.709548 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a\": container with ID starting with 02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a not found: ID does not exist" containerID="02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.709608 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a"} err="failed to get container status \"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a\": rpc error: code = 
NotFound desc = could not find container \"02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a\": container with ID starting with 02ab090064cf43714a361f98c8401a08ff5684170017c160d3affd080275fc5a not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.709644 4775 scope.go:117] "RemoveContainer" containerID="21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.710061 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d\": container with ID starting with 21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d not found: ID does not exist" containerID="21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.710093 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d"} err="failed to get container status \"21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d\": rpc error: code = NotFound desc = could not find container \"21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d\": container with ID starting with 21eb68b1dc783b9a4153f39746a113d0c901b8fcfacb75d4fa0041622f71eb2d not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.710125 4775 scope.go:117] "RemoveContainer" containerID="9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426" Mar 21 04:54:29 crc kubenswrapper[4775]: E0321 04:54:29.710601 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426\": container with ID starting with 
9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426 not found: ID does not exist" containerID="9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.710623 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426"} err="failed to get container status \"9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426\": rpc error: code = NotFound desc = could not find container \"9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426\": container with ID starting with 9f77f33cd00f43111cd055698fe7da46bda8e9c64a949491d07dadf580cf4426 not found: ID does not exist" Mar 21 04:54:29 crc kubenswrapper[4775]: I0321 04:54:29.711780 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hldm7"] Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.335511 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" event={"ID":"59fec450-4b61-4a15-b1b5-b47dedd649a0","Type":"ContainerStarted","Data":"cca7682b1f9e0c6812bda17dc39c9c1c9aaa1c03a18dac4025f41b93dff335ff"} Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.335549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" event={"ID":"59fec450-4b61-4a15-b1b5-b47dedd649a0","Type":"ContainerStarted","Data":"14ec881f38f46e3ad501a36d40d0dbbc592306dd4c1d73a4372f538296655a4f"} Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.335779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.338867 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.367875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z7fcx" podStartSLOduration=2.3678618289999998 podStartE2EDuration="2.367861829s" podCreationTimestamp="2026-03-21 04:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:54:30.351801832 +0000 UTC m=+423.328265486" watchObservedRunningTime="2026-03-21 04:54:30.367861829 +0000 UTC m=+423.344325453" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439376 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77cp9"] Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439605 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439620 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439635 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439643 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439656 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439665 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" 
containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439698 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439705 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439716 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439724 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439735 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439742 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439751 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439758 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" 
containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439771 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439778 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439789 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439795 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439805 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439812 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439822 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439829 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="extract-utilities" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439838 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439844 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: E0321 04:54:30.439856 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439863 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="extract-content" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439963 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb910ab-dcef-4523-81df-c0fb5eb83429" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439974 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439985 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.439994 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" containerName="registry-server" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.440002 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.440015 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" containerName="marketplace-operator" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.440878 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.446083 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.483367 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77cp9"] Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.495489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dhm\" (UniqueName: \"kubernetes.io/projected/08558c65-4599-4c64-bb0f-f18f94cecdec-kube-api-access-q6dhm\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.495559 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-catalog-content\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.495733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-utilities\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.596993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-utilities\") pod \"certified-operators-77cp9\" (UID: 
\"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.597078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dhm\" (UniqueName: \"kubernetes.io/projected/08558c65-4599-4c64-bb0f-f18f94cecdec-kube-api-access-q6dhm\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.597147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-catalog-content\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.597977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-utilities\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.598034 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08558c65-4599-4c64-bb0f-f18f94cecdec-catalog-content\") pod \"certified-operators-77cp9\" (UID: \"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.616054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dhm\" (UniqueName: \"kubernetes.io/projected/08558c65-4599-4c64-bb0f-f18f94cecdec-kube-api-access-q6dhm\") pod \"certified-operators-77cp9\" (UID: 
\"08558c65-4599-4c64-bb0f-f18f94cecdec\") " pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:30 crc kubenswrapper[4775]: I0321 04:54:30.797354 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77cp9" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.042775 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h4wmk"] Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.043702 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.045654 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.053757 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4wmk"] Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.102269 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-catalog-content\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.102339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-utilities\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.102404 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hzhv8\" (UniqueName: \"kubernetes.io/projected/4a9e7d52-67b3-4a38-a978-6566b1c7870a-kube-api-access-hzhv8\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.183688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77cp9"] Mar 21 04:54:31 crc kubenswrapper[4775]: W0321 04:54:31.189794 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08558c65_4599_4c64_bb0f_f18f94cecdec.slice/crio-53d8511c66dacf6ae929a89765dca05bde8c11649a74ae9891aa53b8786e1f13 WatchSource:0}: Error finding container 53d8511c66dacf6ae929a89765dca05bde8c11649a74ae9891aa53b8786e1f13: Status 404 returned error can't find the container with id 53d8511c66dacf6ae929a89765dca05bde8c11649a74ae9891aa53b8786e1f13 Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.203061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhv8\" (UniqueName: \"kubernetes.io/projected/4a9e7d52-67b3-4a38-a978-6566b1c7870a-kube-api-access-hzhv8\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.203440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-catalog-content\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.203481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-utilities\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.203941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-utilities\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.203978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a9e7d52-67b3-4a38-a978-6566b1c7870a-catalog-content\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.222107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhv8\" (UniqueName: \"kubernetes.io/projected/4a9e7d52-67b3-4a38-a978-6566b1c7870a-kube-api-access-hzhv8\") pod \"redhat-marketplace-h4wmk\" (UID: \"4a9e7d52-67b3-4a38-a978-6566b1c7870a\") " pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.348160 4775 generic.go:334] "Generic (PLEG): container finished" podID="08558c65-4599-4c64-bb0f-f18f94cecdec" containerID="c1c5aff8dc874c9ce56fbe3e84035b64de894d7afcf43417e89df45fea4306ba" exitCode=0 Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.348255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77cp9" event={"ID":"08558c65-4599-4c64-bb0f-f18f94cecdec","Type":"ContainerDied","Data":"c1c5aff8dc874c9ce56fbe3e84035b64de894d7afcf43417e89df45fea4306ba"} Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 
04:54:31.348310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77cp9" event={"ID":"08558c65-4599-4c64-bb0f-f18f94cecdec","Type":"ContainerStarted","Data":"53d8511c66dacf6ae929a89765dca05bde8c11649a74ae9891aa53b8786e1f13"} Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.370094 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4wmk" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.672295 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571e84f2-a2bc-4f09-ac53-d4a4adafa80b" path="/var/lib/kubelet/pods/571e84f2-a2bc-4f09-ac53-d4a4adafa80b/volumes" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.673195 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb65dee-cd5f-46b3-9e7d-36e5d182d19e" path="/var/lib/kubelet/pods/5bb65dee-cd5f-46b3-9e7d-36e5d182d19e/volumes" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.673963 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ad413a-5f81-4094-b2d8-9b89698c6e32" path="/var/lib/kubelet/pods/70ad413a-5f81-4094-b2d8-9b89698c6e32/volumes" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.675043 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfbaac71-f99c-4373-a469-f2e5dd0ee632" path="/var/lib/kubelet/pods/dfbaac71-f99c-4373-a469-f2e5dd0ee632/volumes" Mar 21 04:54:31 crc kubenswrapper[4775]: I0321 04:54:31.746160 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4wmk"] Mar 21 04:54:31 crc kubenswrapper[4775]: W0321 04:54:31.752835 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9e7d52_67b3_4a38_a978_6566b1c7870a.slice/crio-89817b57585c98f6ed285ff85f3fa80e94e56cc7b45cf060291a6b8ce625b50b WatchSource:0}: Error finding container 
89817b57585c98f6ed285ff85f3fa80e94e56cc7b45cf060291a6b8ce625b50b: Status 404 returned error can't find the container with id 89817b57585c98f6ed285ff85f3fa80e94e56cc7b45cf060291a6b8ce625b50b Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.354625 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77cp9" event={"ID":"08558c65-4599-4c64-bb0f-f18f94cecdec","Type":"ContainerStarted","Data":"8adc4d44e75efefa651623f47e19f981358f6762d49aaf60f4f2e126668c06b9"} Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.355780 4775 generic.go:334] "Generic (PLEG): container finished" podID="4a9e7d52-67b3-4a38-a978-6566b1c7870a" containerID="713719d27d87bbfcfa4cd0e037e75a5bcdb5f83d1603e453d967d0244b5bf4de" exitCode=0 Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.355996 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4wmk" event={"ID":"4a9e7d52-67b3-4a38-a978-6566b1c7870a","Type":"ContainerDied","Data":"713719d27d87bbfcfa4cd0e037e75a5bcdb5f83d1603e453d967d0244b5bf4de"} Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.356037 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4wmk" event={"ID":"4a9e7d52-67b3-4a38-a978-6566b1c7870a","Type":"ContainerStarted","Data":"89817b57585c98f6ed285ff85f3fa80e94e56cc7b45cf060291a6b8ce625b50b"} Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.481878 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.481942 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.848234 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gslhk"] Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.849736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.851933 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:54:32 crc kubenswrapper[4775]: I0321 04:54:32.854346 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gslhk"] Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.024228 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9p7c\" (UniqueName: \"kubernetes.io/projected/373fa4a4-80b8-4b32-a9ee-b272604d3adc-kube-api-access-h9p7c\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.024282 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-utilities\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.024445 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-catalog-content\") pod 
\"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.125953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9p7c\" (UniqueName: \"kubernetes.io/projected/373fa4a4-80b8-4b32-a9ee-b272604d3adc-kube-api-access-h9p7c\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.126010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-utilities\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.126083 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-catalog-content\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.126547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-catalog-content\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.126725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/373fa4a4-80b8-4b32-a9ee-b272604d3adc-utilities\") pod \"redhat-operators-gslhk\" (UID: 
\"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.159901 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9p7c\" (UniqueName: \"kubernetes.io/projected/373fa4a4-80b8-4b32-a9ee-b272604d3adc-kube-api-access-h9p7c\") pod \"redhat-operators-gslhk\" (UID: \"373fa4a4-80b8-4b32-a9ee-b272604d3adc\") " pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.175180 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gslhk" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.384109 4775 generic.go:334] "Generic (PLEG): container finished" podID="4a9e7d52-67b3-4a38-a978-6566b1c7870a" containerID="64eb994ce0af079d74aaa5eb9da00137399bcf89c4c62a054ea446143999c5d6" exitCode=0 Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.384460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4wmk" event={"ID":"4a9e7d52-67b3-4a38-a978-6566b1c7870a","Type":"ContainerDied","Data":"64eb994ce0af079d74aaa5eb9da00137399bcf89c4c62a054ea446143999c5d6"} Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.388235 4775 generic.go:334] "Generic (PLEG): container finished" podID="08558c65-4599-4c64-bb0f-f18f94cecdec" containerID="8adc4d44e75efefa651623f47e19f981358f6762d49aaf60f4f2e126668c06b9" exitCode=0 Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.388266 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77cp9" event={"ID":"08558c65-4599-4c64-bb0f-f18f94cecdec","Type":"ContainerDied","Data":"8adc4d44e75efefa651623f47e19f981358f6762d49aaf60f4f2e126668c06b9"} Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.438830 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-x59zn"] Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.439919 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.442149 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.454649 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x59zn"] Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.549005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gslhk"] Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.631718 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz69p\" (UniqueName: \"kubernetes.io/projected/e8513867-3729-4d93-b8ca-45ecb69c50e6-kube-api-access-xz69p\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.631769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-utilities\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.631806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-catalog-content\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " 
pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.733361 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz69p\" (UniqueName: \"kubernetes.io/projected/e8513867-3729-4d93-b8ca-45ecb69c50e6-kube-api-access-xz69p\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.733444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-utilities\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.733543 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-catalog-content\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.733960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-utilities\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn" Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.734488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8513867-3729-4d93-b8ca-45ecb69c50e6-catalog-content\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " 
pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.752664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz69p\" (UniqueName: \"kubernetes.io/projected/e8513867-3729-4d93-b8ca-45ecb69c50e6-kube-api-access-xz69p\") pod \"community-operators-x59zn\" (UID: \"e8513867-3729-4d93-b8ca-45ecb69c50e6\") " pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:33 crc kubenswrapper[4775]: I0321 04:54:33.758215 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.185228 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x59zn"]
Mar 21 04:54:34 crc kubenswrapper[4775]: W0321 04:54:34.199276 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8513867_3729_4d93_b8ca_45ecb69c50e6.slice/crio-97fef89995b7abeabce972f4dd29e44dba02fd1652df55c292b81bae7cf2ee8a WatchSource:0}: Error finding container 97fef89995b7abeabce972f4dd29e44dba02fd1652df55c292b81bae7cf2ee8a: Status 404 returned error can't find the container with id 97fef89995b7abeabce972f4dd29e44dba02fd1652df55c292b81bae7cf2ee8a
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.404334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4wmk" event={"ID":"4a9e7d52-67b3-4a38-a978-6566b1c7870a","Type":"ContainerStarted","Data":"cc1ad29ec9a597dea032729d906af4bdb7605800ee74cf73471404f8b9a6d900"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.407457 4775 generic.go:334] "Generic (PLEG): container finished" podID="373fa4a4-80b8-4b32-a9ee-b272604d3adc" containerID="c99df84a5860f15fae9b8f3adb088d68432645354dfffca2f6de63a527e05f0e" exitCode=0
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.407543 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gslhk" event={"ID":"373fa4a4-80b8-4b32-a9ee-b272604d3adc","Type":"ContainerDied","Data":"c99df84a5860f15fae9b8f3adb088d68432645354dfffca2f6de63a527e05f0e"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.407571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gslhk" event={"ID":"373fa4a4-80b8-4b32-a9ee-b272604d3adc","Type":"ContainerStarted","Data":"e54391c9eb367f896efa6df1cd7ef84b9610e98a86ff539f47326508cb83fcf3"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.410181 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77cp9" event={"ID":"08558c65-4599-4c64-bb0f-f18f94cecdec","Type":"ContainerStarted","Data":"868c4b07f4de3727b8ae04aba6da0fe9b79906be41d2cd1280a77062daa256dd"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.414424 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8513867-3729-4d93-b8ca-45ecb69c50e6" containerID="5d48c18f3faab589bc7d3fc497a7cf0764dd1a7869444eb3720b010a70b870c2" exitCode=0
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.414480 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x59zn" event={"ID":"e8513867-3729-4d93-b8ca-45ecb69c50e6","Type":"ContainerDied","Data":"5d48c18f3faab589bc7d3fc497a7cf0764dd1a7869444eb3720b010a70b870c2"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.414539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x59zn" event={"ID":"e8513867-3729-4d93-b8ca-45ecb69c50e6","Type":"ContainerStarted","Data":"97fef89995b7abeabce972f4dd29e44dba02fd1652df55c292b81bae7cf2ee8a"}
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.430908 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h4wmk" podStartSLOduration=1.926252817 podStartE2EDuration="3.430891419s" podCreationTimestamp="2026-03-21 04:54:31 +0000 UTC" firstStartedPulling="2026-03-21 04:54:32.362665891 +0000 UTC m=+425.339129515" lastFinishedPulling="2026-03-21 04:54:33.867304493 +0000 UTC m=+426.843768117" observedRunningTime="2026-03-21 04:54:34.429537203 +0000 UTC m=+427.406000827" watchObservedRunningTime="2026-03-21 04:54:34.430891419 +0000 UTC m=+427.407355043"
Mar 21 04:54:34 crc kubenswrapper[4775]: I0321 04:54:34.508457 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77cp9" podStartSLOduration=2.060206366 podStartE2EDuration="4.508436942s" podCreationTimestamp="2026-03-21 04:54:30 +0000 UTC" firstStartedPulling="2026-03-21 04:54:31.349333142 +0000 UTC m=+424.325796766" lastFinishedPulling="2026-03-21 04:54:33.797563718 +0000 UTC m=+426.774027342" observedRunningTime="2026-03-21 04:54:34.507673441 +0000 UTC m=+427.484137065" watchObservedRunningTime="2026-03-21 04:54:34.508436942 +0000 UTC m=+427.484900566"
Mar 21 04:54:35 crc kubenswrapper[4775]: I0321 04:54:35.421173 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gslhk" event={"ID":"373fa4a4-80b8-4b32-a9ee-b272604d3adc","Type":"ContainerStarted","Data":"872ab07c719410279fd52976d7026dd1e51fa3df57a56a17fe5312e830ff9260"}
Mar 21 04:54:35 crc kubenswrapper[4775]: I0321 04:54:35.422859 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8513867-3729-4d93-b8ca-45ecb69c50e6" containerID="ec1634b3909c3b128dbf98f5658155ac371c891b6e1b1b64672b4b0ba678b266" exitCode=0
Mar 21 04:54:35 crc kubenswrapper[4775]: I0321 04:54:35.423232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x59zn" event={"ID":"e8513867-3729-4d93-b8ca-45ecb69c50e6","Type":"ContainerDied","Data":"ec1634b3909c3b128dbf98f5658155ac371c891b6e1b1b64672b4b0ba678b266"}
Mar 21 04:54:36 crc kubenswrapper[4775]: I0321 04:54:36.431797 4775 generic.go:334] "Generic (PLEG): container finished" podID="373fa4a4-80b8-4b32-a9ee-b272604d3adc" containerID="872ab07c719410279fd52976d7026dd1e51fa3df57a56a17fe5312e830ff9260" exitCode=0
Mar 21 04:54:36 crc kubenswrapper[4775]: I0321 04:54:36.431835 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gslhk" event={"ID":"373fa4a4-80b8-4b32-a9ee-b272604d3adc","Type":"ContainerDied","Data":"872ab07c719410279fd52976d7026dd1e51fa3df57a56a17fe5312e830ff9260"}
Mar 21 04:54:37 crc kubenswrapper[4775]: I0321 04:54:37.438776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gslhk" event={"ID":"373fa4a4-80b8-4b32-a9ee-b272604d3adc","Type":"ContainerStarted","Data":"e3764d781193778add49d893cc700b1f8d1f9c234b6c2effbeb58f7c8db40da8"}
Mar 21 04:54:37 crc kubenswrapper[4775]: I0321 04:54:37.441160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x59zn" event={"ID":"e8513867-3729-4d93-b8ca-45ecb69c50e6","Type":"ContainerStarted","Data":"9ad826a8c49b3029fe8ea5e2f130d732e68b23774780ea22f8580b36613c36b8"}
Mar 21 04:54:37 crc kubenswrapper[4775]: I0321 04:54:37.460093 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gslhk" podStartSLOduration=3.02030458 podStartE2EDuration="5.460075211s" podCreationTimestamp="2026-03-21 04:54:32 +0000 UTC" firstStartedPulling="2026-03-21 04:54:34.409959712 +0000 UTC m=+427.386423336" lastFinishedPulling="2026-03-21 04:54:36.849730343 +0000 UTC m=+429.826193967" observedRunningTime="2026-03-21 04:54:37.459887556 +0000 UTC m=+430.436351180" watchObservedRunningTime="2026-03-21 04:54:37.460075211 +0000 UTC m=+430.436538845"
Mar 21 04:54:37 crc kubenswrapper[4775]: I0321 04:54:37.486343 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x59zn" podStartSLOduration=2.413897692 podStartE2EDuration="4.486321169s" podCreationTimestamp="2026-03-21 04:54:33 +0000 UTC" firstStartedPulling="2026-03-21 04:54:34.415786357 +0000 UTC m=+427.392249981" lastFinishedPulling="2026-03-21 04:54:36.488209834 +0000 UTC m=+429.464673458" observedRunningTime="2026-03-21 04:54:37.474095194 +0000 UTC m=+430.450558828" watchObservedRunningTime="2026-03-21 04:54:37.486321169 +0000 UTC m=+430.462784793"
Mar 21 04:54:40 crc kubenswrapper[4775]: I0321 04:54:40.798492 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77cp9"
Mar 21 04:54:40 crc kubenswrapper[4775]: I0321 04:54:40.798788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77cp9"
Mar 21 04:54:40 crc kubenswrapper[4775]: I0321 04:54:40.847566 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77cp9"
Mar 21 04:54:41 crc kubenswrapper[4775]: I0321 04:54:41.370422 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h4wmk"
Mar 21 04:54:41 crc kubenswrapper[4775]: I0321 04:54:41.370482 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h4wmk"
Mar 21 04:54:41 crc kubenswrapper[4775]: I0321 04:54:41.411179 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h4wmk"
Mar 21 04:54:41 crc kubenswrapper[4775]: I0321 04:54:41.498601 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77cp9"
Mar 21 04:54:41 crc kubenswrapper[4775]: I0321 04:54:41.498662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h4wmk"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.175808 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gslhk"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.176855 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gslhk"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.217563 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gslhk"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.511416 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gslhk"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.759205 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.759269 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:43 crc kubenswrapper[4775]: I0321 04:54:43.795751 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:44 crc kubenswrapper[4775]: I0321 04:54:44.514360 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x59zn"
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.315244 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" podUID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" containerName="registry" containerID="cri-o://126ca17ac1331046f38ef3e95c39b37b4252bef89c1a0511f475a2f5ad46576e" gracePeriod=30
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.519959 4775 generic.go:334] "Generic (PLEG): container finished" podID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" containerID="126ca17ac1331046f38ef3e95c39b37b4252bef89c1a0511f475a2f5ad46576e" exitCode=0
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.520047 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" event={"ID":"60cd23b8-6e1c-492e-aeb5-8b16609d06d1","Type":"ContainerDied","Data":"126ca17ac1331046f38ef3e95c39b37b4252bef89c1a0511f475a2f5ad46576e"}
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.735068 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773159 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773380 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773401 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2j66\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773433 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773450 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.773470 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets\") pod \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\" (UID: \"60cd23b8-6e1c-492e-aeb5-8b16609d06d1\") "
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.774368 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.774982 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.779047 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.779280 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.784193 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.788948 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66" (OuterVolumeSpecName: "kube-api-access-z2j66") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "kube-api-access-z2j66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.793471 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.794999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "60cd23b8-6e1c-492e-aeb5-8b16609d06d1" (UID: "60cd23b8-6e1c-492e-aeb5-8b16609d06d1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874768 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874802 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874814 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874823 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874830 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874838 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2j66\" (UniqueName: \"kubernetes.io/projected/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-kube-api-access-z2j66\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:52 crc kubenswrapper[4775]: I0321 04:54:52.874846 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60cd23b8-6e1c-492e-aeb5-8b16609d06d1-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.527509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t" event={"ID":"60cd23b8-6e1c-492e-aeb5-8b16609d06d1","Type":"ContainerDied","Data":"224b2bc46f2dabb71d4e0596c8a7cb1f7d3a4099e704fb5ab5882b52edcb5f8a"}
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.527600 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hvd8t"
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.527941 4775 scope.go:117] "RemoveContainer" containerID="126ca17ac1331046f38ef3e95c39b37b4252bef89c1a0511f475a2f5ad46576e"
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.568260 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"]
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.574595 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hvd8t"]
Mar 21 04:54:53 crc kubenswrapper[4775]: I0321 04:54:53.669311 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" path="/var/lib/kubelet/pods/60cd23b8-6e1c-492e-aeb5-8b16609d06d1/volumes"
Mar 21 04:55:02 crc kubenswrapper[4775]: I0321 04:55:02.482707 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:55:02 crc kubenswrapper[4775]: I0321 04:55:02.483158 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.482808 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.483411 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.483460 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn"
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.483992 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.484035 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398" gracePeriod=600
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.751040 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398" exitCode=0
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.751168 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398"}
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.751475 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715"}
Mar 21 04:55:32 crc kubenswrapper[4775]: I0321 04:55:32.751520 4775 scope.go:117] "RemoveContainer" containerID="94bb084d125f86725a85d0175e0edb9139a2dae93d13cf9dccf120d03cbeafee"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.131407 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567816-7wggn"]
Mar 21 04:56:00 crc kubenswrapper[4775]: E0321 04:56:00.132194 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" containerName="registry"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.132211 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" containerName="registry"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.132337 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cd23b8-6e1c-492e-aeb5-8b16609d06d1" containerName="registry"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.134499 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.136793 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.136824 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.137329 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.145078 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-7wggn"]
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.228850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d4bx\" (UniqueName: \"kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx\") pod \"auto-csr-approver-29567816-7wggn\" (UID: \"1edcd264-4604-481b-b146-1ed5a34badba\") " pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.329868 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d4bx\" (UniqueName: \"kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx\") pod \"auto-csr-approver-29567816-7wggn\" (UID: \"1edcd264-4604-481b-b146-1ed5a34badba\") " pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.349017 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d4bx\" (UniqueName: \"kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx\") pod \"auto-csr-approver-29567816-7wggn\" (UID: \"1edcd264-4604-481b-b146-1ed5a34badba\") " pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.455937 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.824849 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-7wggn"]
Mar 21 04:56:00 crc kubenswrapper[4775]: I0321 04:56:00.911266 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-7wggn" event={"ID":"1edcd264-4604-481b-b146-1ed5a34badba","Type":"ContainerStarted","Data":"f56a26e1863543e016e4efa5a83880872379157408d013285603de0a253952d7"}
Mar 21 04:56:01 crc kubenswrapper[4775]: I0321 04:56:01.918730 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-7wggn" event={"ID":"1edcd264-4604-481b-b146-1ed5a34badba","Type":"ContainerStarted","Data":"67295240b136b0bcf5b3e83b034399ed927b48c5da3ccf9341e8f2fa7f873c3a"}
Mar 21 04:56:01 crc kubenswrapper[4775]: I0321 04:56:01.930671 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567816-7wggn" podStartSLOduration=1.154532562 podStartE2EDuration="1.930651583s" podCreationTimestamp="2026-03-21 04:56:00 +0000 UTC" firstStartedPulling="2026-03-21 04:56:00.836583609 +0000 UTC m=+513.813047233" lastFinishedPulling="2026-03-21 04:56:01.61270263 +0000 UTC m=+514.589166254" observedRunningTime="2026-03-21 04:56:01.929157044 +0000 UTC m=+514.905620688" watchObservedRunningTime="2026-03-21 04:56:01.930651583 +0000 UTC m=+514.907115207"
Mar 21 04:56:02 crc kubenswrapper[4775]: E0321 04:56:02.129196 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1edcd264_4604_481b_b146_1ed5a34badba.slice/crio-67295240b136b0bcf5b3e83b034399ed927b48c5da3ccf9341e8f2fa7f873c3a.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 04:56:02 crc kubenswrapper[4775]: I0321 04:56:02.927521 4775 generic.go:334] "Generic (PLEG): container finished" podID="1edcd264-4604-481b-b146-1ed5a34badba" containerID="67295240b136b0bcf5b3e83b034399ed927b48c5da3ccf9341e8f2fa7f873c3a" exitCode=0
Mar 21 04:56:02 crc kubenswrapper[4775]: I0321 04:56:02.927590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-7wggn" event={"ID":"1edcd264-4604-481b-b146-1ed5a34badba","Type":"ContainerDied","Data":"67295240b136b0bcf5b3e83b034399ed927b48c5da3ccf9341e8f2fa7f873c3a"}
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.149649 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.180186 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d4bx\" (UniqueName: \"kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx\") pod \"1edcd264-4604-481b-b146-1ed5a34badba\" (UID: \"1edcd264-4604-481b-b146-1ed5a34badba\") "
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.188955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx" (OuterVolumeSpecName: "kube-api-access-9d4bx") pod "1edcd264-4604-481b-b146-1ed5a34badba" (UID: "1edcd264-4604-481b-b146-1ed5a34badba"). InnerVolumeSpecName "kube-api-access-9d4bx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.281362 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d4bx\" (UniqueName: \"kubernetes.io/projected/1edcd264-4604-481b-b146-1ed5a34badba-kube-api-access-9d4bx\") on node \"crc\" DevicePath \"\""
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.939571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-7wggn" event={"ID":"1edcd264-4604-481b-b146-1ed5a34badba","Type":"ContainerDied","Data":"f56a26e1863543e016e4efa5a83880872379157408d013285603de0a253952d7"}
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.939907 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f56a26e1863543e016e4efa5a83880872379157408d013285603de0a253952d7"
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.939655 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-7wggn"
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.985855 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-wb89g"]
Mar 21 04:56:04 crc kubenswrapper[4775]: I0321 04:56:04.989902 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-wb89g"]
Mar 21 04:56:05 crc kubenswrapper[4775]: I0321 04:56:05.670874 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd25c8a4-8047-4602-a95b-3308af65bd38" path="/var/lib/kubelet/pods/cd25c8a4-8047-4602-a95b-3308af65bd38/volumes"
Mar 21 04:57:32 crc kubenswrapper[4775]: I0321 04:57:32.482745 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:57:32 crc kubenswrapper[4775]: I0321 04:57:32.483308 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:57:37 crc kubenswrapper[4775]: I0321 04:57:37.022965 4775 scope.go:117] "RemoveContainer" containerID="5c917ed3fbfe2c82bdf6a98ab031c5d0f484f2833ad0eb7cad0c0b48f3a9128d"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.144286 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567818-42hv5"]
Mar 21 04:58:00 crc kubenswrapper[4775]: E0321 04:58:00.145164 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edcd264-4604-481b-b146-1ed5a34badba" containerName="oc"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.145185 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edcd264-4604-481b-b146-1ed5a34badba" containerName="oc"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.145354 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edcd264-4604-481b-b146-1ed5a34badba" containerName="oc"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.145891 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-42hv5"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.148959 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.149226 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.149644 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.162184 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-42hv5"]
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.320679 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5mh\" (UniqueName: \"kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh\") pod \"auto-csr-approver-29567818-42hv5\" (UID: \"40cf5461-a1ca-40cb-9ed3-9c2de90faad3\") " pod="openshift-infra/auto-csr-approver-29567818-42hv5"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.421717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5mh\" (UniqueName: \"kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh\") pod \"auto-csr-approver-29567818-42hv5\" (UID: \"40cf5461-a1ca-40cb-9ed3-9c2de90faad3\") " pod="openshift-infra/auto-csr-approver-29567818-42hv5"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.445139 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5mh\" (UniqueName: \"kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh\") pod \"auto-csr-approver-29567818-42hv5\" (UID: \"40cf5461-a1ca-40cb-9ed3-9c2de90faad3\") " pod="openshift-infra/auto-csr-approver-29567818-42hv5"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.475889 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-42hv5"
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.709708 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-42hv5"]
Mar 21 04:58:00 crc kubenswrapper[4775]: I0321 04:58:00.720389 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 04:58:01 crc kubenswrapper[4775]: I0321 04:58:01.613237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-42hv5" event={"ID":"40cf5461-a1ca-40cb-9ed3-9c2de90faad3","Type":"ContainerStarted","Data":"08556185bd2b965413587182184a1d1685909870272fa6ed0f264c756e900c6b"}
Mar 21 04:58:02 crc kubenswrapper[4775]: I0321 04:58:02.482466 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:58:02 crc kubenswrapper[4775]: I0321 04:58:02.483066 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:58:02 crc kubenswrapper[4775]: I0321 04:58:02.624199 4775 generic.go:334] "Generic (PLEG): container finished" podID="40cf5461-a1ca-40cb-9ed3-9c2de90faad3" containerID="a6f778f857c6aa3ee7f297014df3be268b9de30de7297b84506cc38e86b9d649" exitCode=0
Mar 21 04:58:02 crc kubenswrapper[4775]: I0321 04:58:02.624276 4775 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-42hv5" event={"ID":"40cf5461-a1ca-40cb-9ed3-9c2de90faad3","Type":"ContainerDied","Data":"a6f778f857c6aa3ee7f297014df3be268b9de30de7297b84506cc38e86b9d649"} Mar 21 04:58:03 crc kubenswrapper[4775]: I0321 04:58:03.852669 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-42hv5" Mar 21 04:58:03 crc kubenswrapper[4775]: I0321 04:58:03.967460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b5mh\" (UniqueName: \"kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh\") pod \"40cf5461-a1ca-40cb-9ed3-9c2de90faad3\" (UID: \"40cf5461-a1ca-40cb-9ed3-9c2de90faad3\") " Mar 21 04:58:03 crc kubenswrapper[4775]: I0321 04:58:03.973955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh" (OuterVolumeSpecName: "kube-api-access-4b5mh") pod "40cf5461-a1ca-40cb-9ed3-9c2de90faad3" (UID: "40cf5461-a1ca-40cb-9ed3-9c2de90faad3"). InnerVolumeSpecName "kube-api-access-4b5mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.069112 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b5mh\" (UniqueName: \"kubernetes.io/projected/40cf5461-a1ca-40cb-9ed3-9c2de90faad3-kube-api-access-4b5mh\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.637289 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-42hv5" event={"ID":"40cf5461-a1ca-40cb-9ed3-9c2de90faad3","Type":"ContainerDied","Data":"08556185bd2b965413587182184a1d1685909870272fa6ed0f264c756e900c6b"} Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.637345 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08556185bd2b965413587182184a1d1685909870272fa6ed0f264c756e900c6b" Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.637365 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-42hv5" Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.918053 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-6wppn"] Mar 21 04:58:04 crc kubenswrapper[4775]: I0321 04:58:04.922449 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-6wppn"] Mar 21 04:58:05 crc kubenswrapper[4775]: I0321 04:58:05.672292 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8101654-10fd-404b-ae0f-a098719418f4" path="/var/lib/kubelet/pods/d8101654-10fd-404b-ae0f-a098719418f4/volumes" Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.482661 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.483289 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.483339 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.484024 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.484099 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715" gracePeriod=600 Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.825800 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715" exitCode=0 Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.825901 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715"} Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.826190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4"} Mar 21 04:58:32 crc kubenswrapper[4775]: I0321 04:58:32.826239 4775 scope.go:117] "RemoveContainer" containerID="bce245e457399f7eaba83f68d2114b3ba9e57f5a921fcddf334a4766f16c7398" Mar 21 04:58:37 crc kubenswrapper[4775]: I0321 04:58:37.075384 4775 scope.go:117] "RemoveContainer" containerID="4cd8fe98e0e79605364ea6850bbec4e960085c0c48458cb67fb164296fe16045" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.731861 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-c2lsw"] Mar 21 04:59:51 crc kubenswrapper[4775]: E0321 04:59:51.733141 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cf5461-a1ca-40cb-9ed3-9c2de90faad3" containerName="oc" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.733160 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cf5461-a1ca-40cb-9ed3-9c2de90faad3" containerName="oc" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.733300 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cf5461-a1ca-40cb-9ed3-9c2de90faad3" containerName="oc" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.733940 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c2lsw" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.735407 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ngckr"] Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.736381 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.736408 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.736591 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7lq2p" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.736777 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.738854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wdbmp" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.745749 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdtw\" (UniqueName: \"kubernetes.io/projected/4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe-kube-api-access-zsdtw\") pod \"cert-manager-858654f9db-c2lsw\" (UID: \"4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe\") " pod="cert-manager/cert-manager-858654f9db-c2lsw" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.745805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5njr\" (UniqueName: \"kubernetes.io/projected/172b2006-3394-469a-be7f-1b66d020fd45-kube-api-access-q5njr\") pod \"cert-manager-cainjector-cf98fcc89-ngckr\" (UID: \"172b2006-3394-469a-be7f-1b66d020fd45\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.757285 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ngckr"] Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.762902 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ms9hv"] Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.763599 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.765876 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-872m9" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.788179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c2lsw"] Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.797697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ms9hv"] Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.847762 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdtw\" (UniqueName: \"kubernetes.io/projected/4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe-kube-api-access-zsdtw\") pod \"cert-manager-858654f9db-c2lsw\" (UID: \"4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe\") " pod="cert-manager/cert-manager-858654f9db-c2lsw" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.847877 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5njr\" (UniqueName: \"kubernetes.io/projected/172b2006-3394-469a-be7f-1b66d020fd45-kube-api-access-q5njr\") pod \"cert-manager-cainjector-cf98fcc89-ngckr\" (UID: \"172b2006-3394-469a-be7f-1b66d020fd45\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" Mar 21 04:59:51 crc 
kubenswrapper[4775]: I0321 04:59:51.847932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42789\" (UniqueName: \"kubernetes.io/projected/670f734f-e215-441e-9b56-7251bc7f2484-kube-api-access-42789\") pod \"cert-manager-webhook-687f57d79b-ms9hv\" (UID: \"670f734f-e215-441e-9b56-7251bc7f2484\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.873498 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5njr\" (UniqueName: \"kubernetes.io/projected/172b2006-3394-469a-be7f-1b66d020fd45-kube-api-access-q5njr\") pod \"cert-manager-cainjector-cf98fcc89-ngckr\" (UID: \"172b2006-3394-469a-be7f-1b66d020fd45\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.877291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdtw\" (UniqueName: \"kubernetes.io/projected/4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe-kube-api-access-zsdtw\") pod \"cert-manager-858654f9db-c2lsw\" (UID: \"4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe\") " pod="cert-manager/cert-manager-858654f9db-c2lsw" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.949163 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42789\" (UniqueName: \"kubernetes.io/projected/670f734f-e215-441e-9b56-7251bc7f2484-kube-api-access-42789\") pod \"cert-manager-webhook-687f57d79b-ms9hv\" (UID: \"670f734f-e215-441e-9b56-7251bc7f2484\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:51 crc kubenswrapper[4775]: I0321 04:59:51.975919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42789\" (UniqueName: \"kubernetes.io/projected/670f734f-e215-441e-9b56-7251bc7f2484-kube-api-access-42789\") pod \"cert-manager-webhook-687f57d79b-ms9hv\" (UID: 
\"670f734f-e215-441e-9b56-7251bc7f2484\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.055435 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c2lsw" Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.063245 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.082437 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.312754 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ngckr"] Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.340848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ms9hv"] Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.346056 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" event={"ID":"172b2006-3394-469a-be7f-1b66d020fd45","Type":"ContainerStarted","Data":"19c5215685bfb668cb35aa2a26273b0b3c60e56f7ccd9949fe72bc98d0c2e2cf"} Mar 21 04:59:52 crc kubenswrapper[4775]: W0321 04:59:52.347774 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod670f734f_e215_441e_9b56_7251bc7f2484.slice/crio-2b091bba955a87c7e5271b0c1fcb64ec94db8632dd253448d1aeb06c1f732520 WatchSource:0}: Error finding container 2b091bba955a87c7e5271b0c1fcb64ec94db8632dd253448d1aeb06c1f732520: Status 404 returned error can't find the container with id 2b091bba955a87c7e5271b0c1fcb64ec94db8632dd253448d1aeb06c1f732520 Mar 21 04:59:52 crc kubenswrapper[4775]: I0321 04:59:52.370014 4775 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c2lsw"] Mar 21 04:59:52 crc kubenswrapper[4775]: W0321 04:59:52.378408 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fdc8b75_b0a1_4ed3_9eee_6ee726dd0fbe.slice/crio-88e9bcf5f70d457cf4a7035b4c6a662a45207ada3ef6129d5a2d0c1f5d7644be WatchSource:0}: Error finding container 88e9bcf5f70d457cf4a7035b4c6a662a45207ada3ef6129d5a2d0c1f5d7644be: Status 404 returned error can't find the container with id 88e9bcf5f70d457cf4a7035b4c6a662a45207ada3ef6129d5a2d0c1f5d7644be Mar 21 04:59:53 crc kubenswrapper[4775]: I0321 04:59:53.356174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c2lsw" event={"ID":"4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe","Type":"ContainerStarted","Data":"88e9bcf5f70d457cf4a7035b4c6a662a45207ada3ef6129d5a2d0c1f5d7644be"} Mar 21 04:59:53 crc kubenswrapper[4775]: I0321 04:59:53.357938 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" event={"ID":"670f734f-e215-441e-9b56-7251bc7f2484","Type":"ContainerStarted","Data":"2b091bba955a87c7e5271b0c1fcb64ec94db8632dd253448d1aeb06c1f732520"} Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.378407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" event={"ID":"172b2006-3394-469a-be7f-1b66d020fd45","Type":"ContainerStarted","Data":"c70f86ffbfe854c6689cd59af95dda0da76c62bf835fda73a1e1b4aff21123e8"} Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.380539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" event={"ID":"670f734f-e215-441e-9b56-7251bc7f2484","Type":"ContainerStarted","Data":"ac927c4968781bf2bbd9aaf0fb4591462f558b972010214594ad0567776ff09b"} Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.381372 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.383445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c2lsw" event={"ID":"4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe","Type":"ContainerStarted","Data":"d1746e5e6672ac03f818cd2f4b17ba571d21e797dc6fbd87e03ac663b47fb65a"} Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.400956 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ngckr" podStartSLOduration=1.752043246 podStartE2EDuration="5.400927666s" podCreationTimestamp="2026-03-21 04:59:51 +0000 UTC" firstStartedPulling="2026-03-21 04:59:52.323302889 +0000 UTC m=+745.299766513" lastFinishedPulling="2026-03-21 04:59:55.972187299 +0000 UTC m=+748.948650933" observedRunningTime="2026-03-21 04:59:56.395838942 +0000 UTC m=+749.372302566" watchObservedRunningTime="2026-03-21 04:59:56.400927666 +0000 UTC m=+749.377391300" Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.464525 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-c2lsw" podStartSLOduration=1.8094235859999999 podStartE2EDuration="5.464498971s" podCreationTimestamp="2026-03-21 04:59:51 +0000 UTC" firstStartedPulling="2026-03-21 04:59:52.381695828 +0000 UTC m=+745.358159452" lastFinishedPulling="2026-03-21 04:59:56.036771213 +0000 UTC m=+749.013234837" observedRunningTime="2026-03-21 04:59:56.458515272 +0000 UTC m=+749.434978926" watchObservedRunningTime="2026-03-21 04:59:56.464498971 +0000 UTC m=+749.440962605" Mar 21 04:59:56 crc kubenswrapper[4775]: I0321 04:59:56.489052 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" podStartSLOduration=1.818157943 podStartE2EDuration="5.489026864s" 
podCreationTimestamp="2026-03-21 04:59:51 +0000 UTC" firstStartedPulling="2026-03-21 04:59:52.354605923 +0000 UTC m=+745.331069547" lastFinishedPulling="2026-03-21 04:59:56.025474824 +0000 UTC m=+749.001938468" observedRunningTime="2026-03-21 04:59:56.486205604 +0000 UTC m=+749.462669238" watchObservedRunningTime="2026-03-21 04:59:56.489026864 +0000 UTC m=+749.465490498" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.158015 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567820-cngbx"] Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.159564 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.162290 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.162480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.162636 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.175288 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq"] Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.176469 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-cngbx"] Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.176606 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.210894 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.212628 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.217676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.217752 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npphs\" (UniqueName: \"kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs\") pod \"auto-csr-approver-29567820-cngbx\" (UID: \"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3\") " pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.217826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.217852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5bk98\" (UniqueName: \"kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.228941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq"] Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.319463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.319526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bk98\" (UniqueName: \"kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.319564 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.319610 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npphs\" (UniqueName: 
\"kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs\") pod \"auto-csr-approver-29567820-cngbx\" (UID: \"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3\") " pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.321710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.327794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.340170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bk98\" (UniqueName: \"kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98\") pod \"collect-profiles-29567820-pcqhq\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.342002 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npphs\" (UniqueName: \"kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs\") pod \"auto-csr-approver-29567820-cngbx\" (UID: \"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3\") " pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.527408 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.541331 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.739558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq"] Mar 21 05:00:00 crc kubenswrapper[4775]: W0321 05:00:00.747886 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421d78dc_59bb_4b5b_9738_1eb6a6144b38.slice/crio-639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d WatchSource:0}: Error finding container 639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d: Status 404 returned error can't find the container with id 639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d Mar 21 05:00:00 crc kubenswrapper[4775]: I0321 05:00:00.775058 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-cngbx"] Mar 21 05:00:00 crc kubenswrapper[4775]: W0321 05:00:00.783276 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a4faca2_a8cb_41b5_a0cf_44be1cea8fd3.slice/crio-b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde WatchSource:0}: Error finding container b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde: Status 404 returned error can't find the container with id b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.424184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-cngbx" 
event={"ID":"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3","Type":"ContainerStarted","Data":"b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde"} Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.426254 4775 generic.go:334] "Generic (PLEG): container finished" podID="421d78dc-59bb-4b5b-9738-1eb6a6144b38" containerID="19febf6fd2202a1482d38e4d840812756bb31244e9ab9826c5c08685c9d1d70a" exitCode=0 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.426284 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" event={"ID":"421d78dc-59bb-4b5b-9738-1eb6a6144b38","Type":"ContainerDied","Data":"19febf6fd2202a1482d38e4d840812756bb31244e9ab9826c5c08685c9d1d70a"} Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.426302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" event={"ID":"421d78dc-59bb-4b5b-9738-1eb6a6144b38","Type":"ContainerStarted","Data":"639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d"} Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.683282 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqtk"] Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.683720 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-controller" containerID="cri-o://5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684108 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="sbdb" containerID="cri-o://2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" gracePeriod=30 Mar 21 
05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684183 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="nbdb" containerID="cri-o://5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684223 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="northd" containerID="cri-o://f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684261 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684304 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-node" containerID="cri-o://6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.684342 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-acl-logging" containerID="cri-o://6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.734949 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" containerID="cri-o://294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" gracePeriod=30 Mar 21 05:00:01 crc kubenswrapper[4775]: I0321 05:00:01.998234 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/3.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.000854 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovn-acl-logging/0.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.001467 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovn-controller/0.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.001995 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.039768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.039835 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.039881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.039961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.039996 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040018 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040060 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040083 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040106 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040159 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: 
\"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040179 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040318 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash" (OuterVolumeSpecName: "host-slash") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040655 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040632 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket" (OuterVolumeSpecName: "log-socket") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040821 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.040848 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log" (OuterVolumeSpecName: "node-log") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041275 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041298 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041327 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041362 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041395 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041426 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041452 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041494 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7w6b\" (UniqueName: \"kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041524 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.041556 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib\") pod \"a69d31f5-deeb-4860-be96-ed5547831685\" (UID: \"a69d31f5-deeb-4860-be96-ed5547831685\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042022 4775 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-log-socket\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042057 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042205 4775 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042304 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042361 4775 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042379 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042397 4775 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-slash\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042413 4775 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042429 4775 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042445 4775 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-node-log\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042459 4775 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042474 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042492 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc 
kubenswrapper[4775]: I0321 05:00:02.042507 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042521 4775 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042536 4775 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.042609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.046009 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.046640 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b" (OuterVolumeSpecName: "kube-api-access-h7w6b") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "kube-api-access-h7w6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.065822 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sbftc"] Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066471 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kubecfg-setup" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066497 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kubecfg-setup" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066518 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066541 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066550 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066560 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066576 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066595 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066611 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="nbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066621 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="nbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066643 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066656 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066666 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066674 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066703 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066713 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066727 4775 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-acl-logging" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066735 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-acl-logging" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-node" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066759 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-node" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066771 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="northd" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066779 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="northd" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.066793 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="sbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066801 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="sbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066942 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="nbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066958 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066970 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066983 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="sbdb" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.066996 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067006 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067015 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="kube-rbac-proxy-node" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067028 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-acl-logging" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067039 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067048 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="northd" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067057 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovn-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.067231 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067242 4775 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.067368 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69d31f5-deeb-4860-be96-ed5547831685" containerName="ovnkube-controller" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.069408 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a69d31f5-deeb-4860-be96-ed5547831685" (UID: "a69d31f5-deeb-4860-be96-ed5547831685"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.069580 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.085240 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-ms9hv" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.144103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rbww\" (UniqueName: \"kubernetes.io/projected/82b41c52-e915-4cf3-808e-79211ba01fce-kube-api-access-5rbww\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.144512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-systemd-units\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 
05:00:02.144673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-var-lib-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.144778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-slash\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.144870 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-ovn\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.144967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-netns\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-kubelet\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-config\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145440 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-bin\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-env-overrides\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145743 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82b41c52-e915-4cf3-808e-79211ba01fce-ovn-node-metrics-cert\") pod \"ovnkube-node-sbftc\" (UID: 
\"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.145884 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-etc-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146209 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-systemd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-script-lib\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-node-log\") pod 
\"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146780 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-netd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.146925 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-log-socket\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.147221 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69d31f5-deeb-4860-be96-ed5547831685-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.147321 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7w6b\" (UniqueName: \"kubernetes.io/projected/a69d31f5-deeb-4860-be96-ed5547831685-kube-api-access-h7w6b\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.147411 4775 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69d31f5-deeb-4860-be96-ed5547831685-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.147482 4775 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69d31f5-deeb-4860-be96-ed5547831685-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-netd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248455 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-log-socket\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248484 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rbww\" (UniqueName: \"kubernetes.io/projected/82b41c52-e915-4cf3-808e-79211ba01fce-kube-api-access-5rbww\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc 
kubenswrapper[4775]: I0321 05:00:02.248504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-systemd-units\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-var-lib-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248521 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-slash\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248541 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-slash\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248630 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-log-socket\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-netd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248879 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-ovn\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-var-lib-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-systemd-units\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248964 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-netns\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.248990 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-ovn\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249000 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-kubelet\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-netns\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-config\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-kubelet\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-bin\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-env-overrides\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82b41c52-e915-4cf3-808e-79211ba01fce-ovn-node-metrics-cert\") pod 
\"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-etc-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249327 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-cni-bin\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-etc-openvswitch\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-systemd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-run-systemd\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-host-run-ovn-kubernetes\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249490 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-script-lib\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-node-log\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.249625 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82b41c52-e915-4cf3-808e-79211ba01fce-node-log\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 
05:00:02.249725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-config\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.250169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-env-overrides\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.250656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82b41c52-e915-4cf3-808e-79211ba01fce-ovnkube-script-lib\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.255078 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82b41c52-e915-4cf3-808e-79211ba01fce-ovn-node-metrics-cert\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.272318 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rbww\" (UniqueName: \"kubernetes.io/projected/82b41c52-e915-4cf3-808e-79211ba01fce-kube-api-access-5rbww\") pod \"ovnkube-node-sbftc\" (UID: \"82b41c52-e915-4cf3-808e-79211ba01fce\") " pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.386633 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.437098 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovnkube-controller/3.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.439810 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovn-acl-logging/0.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.440621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mzqtk_a69d31f5-deeb-4860-be96-ed5547831685/ovn-controller/0.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441173 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441214 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441231 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441250 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441266 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" 
containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441279 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441294 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" exitCode=143 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441311 4775 generic.go:334] "Generic (PLEG): container finished" podID="a69d31f5-deeb-4860-be96-ed5547831685" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" exitCode=143 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441387 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441431 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441454 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441473 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" 
event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441493 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441505 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441527 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441512 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441743 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441760 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441772 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441787 4775 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441801 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441816 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441829 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441844 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441859 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441907 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} Mar 21 
05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441923 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441937 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441953 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441966 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441979 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.441993 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442006 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442020 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 
05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442035 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442055 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442078 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442094 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442109 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442206 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442218 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442229 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442240 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442251 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442262 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442273 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442288 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqtk" event={"ID":"a69d31f5-deeb-4860-be96-ed5547831685","Type":"ContainerDied","Data":"a053283b74d1089f3887c9fbc93ee3063db45e363837d9c9231a5cf45771e07b"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442306 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442320 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442331 4775 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442342 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442353 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442364 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442375 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442387 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442398 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.442409 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.443723 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"b1c177be1dcd02031cb1c2daefe3b54226235529cfff4a15959f6d5ad20824e0"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.447182 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/2.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.448162 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/1.log" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.448219 4775 generic.go:334] "Generic (PLEG): container finished" podID="e77ec218-42da-4f07-b214-184c4f3b20f3" containerID="ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e" exitCode=2 Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.448373 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerDied","Data":"ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.448483 4775 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae"} Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.449103 4775 scope.go:117] "RemoveContainer" containerID="ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.449395 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-556rg_openshift-multus(e77ec218-42da-4f07-b214-184c4f3b20f3)\"" 
pod="openshift-multus/multus-556rg" podUID="e77ec218-42da-4f07-b214-184c4f3b20f3" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.480965 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.536851 4775 scope.go:117] "RemoveContainer" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.537234 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.546493 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqtk"] Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.550761 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqtk"] Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.552381 4775 scope.go:117] "RemoveContainer" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.572404 4775 scope.go:117] "RemoveContainer" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.590373 4775 scope.go:117] "RemoveContainer" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.653492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume\") pod \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.653645 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume\") pod \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.653706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bk98\" (UniqueName: \"kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98\") pod \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\" (UID: \"421d78dc-59bb-4b5b-9738-1eb6a6144b38\") " Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.654398 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume" (OuterVolumeSpecName: "config-volume") pod "421d78dc-59bb-4b5b-9738-1eb6a6144b38" (UID: "421d78dc-59bb-4b5b-9738-1eb6a6144b38"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.657285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "421d78dc-59bb-4b5b-9738-1eb6a6144b38" (UID: "421d78dc-59bb-4b5b-9738-1eb6a6144b38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.657951 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98" (OuterVolumeSpecName: "kube-api-access-5bk98") pod "421d78dc-59bb-4b5b-9738-1eb6a6144b38" (UID: "421d78dc-59bb-4b5b-9738-1eb6a6144b38"). InnerVolumeSpecName "kube-api-access-5bk98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.658638 4775 scope.go:117] "RemoveContainer" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.673679 4775 scope.go:117] "RemoveContainer" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.689786 4775 scope.go:117] "RemoveContainer" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.701908 4775 scope.go:117] "RemoveContainer" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.716739 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.717276 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does not exist" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.717324 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} err="failed to get container status \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does 
not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.717352 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.717675 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": container with ID starting with 7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a not found: ID does not exist" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.717718 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} err="failed to get container status \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": rpc error: code = NotFound desc = could not find container \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": container with ID starting with 7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.717746 4775 scope.go:117] "RemoveContainer" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.718030 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": container with ID starting with 2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d not found: ID does not exist" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718059 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} err="failed to get container status \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": rpc error: code = NotFound desc = could not find container \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": container with ID starting with 2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718073 4775 scope.go:117] "RemoveContainer" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.718517 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": container with ID starting with 5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99 not found: ID does not exist" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718544 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} err="failed to get container status \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": rpc error: code = NotFound desc = could not find container \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": container with ID starting with 5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718564 4775 scope.go:117] "RemoveContainer" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.718820 4775 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": container with ID starting with f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28 not found: ID does not exist" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718846 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} err="failed to get container status \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": rpc error: code = NotFound desc = could not find container \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": container with ID starting with f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.718859 4775 scope.go:117] "RemoveContainer" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.719091 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": container with ID starting with 70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4 not found: ID does not exist" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719109 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} err="failed to get container status \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": rpc error: code = NotFound desc = could 
not find container \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": container with ID starting with 70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719143 4775 scope.go:117] "RemoveContainer" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.719492 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": container with ID starting with 6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc not found: ID does not exist" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719512 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} err="failed to get container status \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": rpc error: code = NotFound desc = could not find container \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": container with ID starting with 6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719525 4775 scope.go:117] "RemoveContainer" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.719734 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": container with ID starting with 6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c not found: 
ID does not exist" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719758 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} err="failed to get container status \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": rpc error: code = NotFound desc = could not find container \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": container with ID starting with 6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.719771 4775 scope.go:117] "RemoveContainer" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.720004 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": container with ID starting with 5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde not found: ID does not exist" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720025 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} err="failed to get container status \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": rpc error: code = NotFound desc = could not find container \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": container with ID starting with 5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720041 4775 
scope.go:117] "RemoveContainer" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: E0321 05:00:02.720299 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": container with ID starting with effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6 not found: ID does not exist" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720331 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} err="failed to get container status \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": rpc error: code = NotFound desc = could not find container \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": container with ID starting with effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720349 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720564 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} err="failed to get container status \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 
05:00:02.720587 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720767 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} err="failed to get container status \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": rpc error: code = NotFound desc = could not find container \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": container with ID starting with 7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.720788 4775 scope.go:117] "RemoveContainer" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721054 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} err="failed to get container status \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": rpc error: code = NotFound desc = could not find container \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": container with ID starting with 2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721094 4775 scope.go:117] "RemoveContainer" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721335 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} err="failed to get container status 
\"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": rpc error: code = NotFound desc = could not find container \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": container with ID starting with 5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721355 4775 scope.go:117] "RemoveContainer" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721600 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} err="failed to get container status \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": rpc error: code = NotFound desc = could not find container \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": container with ID starting with f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721627 4775 scope.go:117] "RemoveContainer" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721870 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} err="failed to get container status \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": rpc error: code = NotFound desc = could not find container \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": container with ID starting with 70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.721911 4775 scope.go:117] "RemoveContainer" 
containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722374 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} err="failed to get container status \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": rpc error: code = NotFound desc = could not find container \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": container with ID starting with 6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722397 4775 scope.go:117] "RemoveContainer" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722668 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} err="failed to get container status \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": rpc error: code = NotFound desc = could not find container \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": container with ID starting with 6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722692 4775 scope.go:117] "RemoveContainer" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722874 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} err="failed to get container status \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": rpc error: code = NotFound desc = could 
not find container \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": container with ID starting with 5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.722894 4775 scope.go:117] "RemoveContainer" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723101 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} err="failed to get container status \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": rpc error: code = NotFound desc = could not find container \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": container with ID starting with effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723135 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723375 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} err="failed to get container status \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723395 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 
05:00:02.723687 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} err="failed to get container status \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": rpc error: code = NotFound desc = could not find container \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": container with ID starting with 7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723729 4775 scope.go:117] "RemoveContainer" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723961 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} err="failed to get container status \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": rpc error: code = NotFound desc = could not find container \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": container with ID starting with 2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.723983 4775 scope.go:117] "RemoveContainer" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724195 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} err="failed to get container status \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": rpc error: code = NotFound desc = could not find container \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": container with ID starting with 
5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724212 4775 scope.go:117] "RemoveContainer" containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724462 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} err="failed to get container status \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": rpc error: code = NotFound desc = could not find container \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": container with ID starting with f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724511 4775 scope.go:117] "RemoveContainer" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724731 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} err="failed to get container status \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": rpc error: code = NotFound desc = could not find container \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": container with ID starting with 70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724750 4775 scope.go:117] "RemoveContainer" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724973 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} err="failed to get container status \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": rpc error: code = NotFound desc = could not find container \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": container with ID starting with 6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.724993 4775 scope.go:117] "RemoveContainer" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725195 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} err="failed to get container status \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": rpc error: code = NotFound desc = could not find container \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": container with ID starting with 6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725214 4775 scope.go:117] "RemoveContainer" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725440 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} err="failed to get container status \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": rpc error: code = NotFound desc = could not find container \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": container with ID starting with 5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde not found: ID does not 
exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725464 4775 scope.go:117] "RemoveContainer" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725745 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} err="failed to get container status \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": rpc error: code = NotFound desc = could not find container \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": container with ID starting with effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.725783 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726030 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} err="failed to get container status \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726059 4775 scope.go:117] "RemoveContainer" containerID="7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726307 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a"} err="failed to get container status 
\"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": rpc error: code = NotFound desc = could not find container \"7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a\": container with ID starting with 7d306fe24fb21d9e53a0846533716c9f52156736305200511b0df65cf5c69a6a not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726335 4775 scope.go:117] "RemoveContainer" containerID="2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726607 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d"} err="failed to get container status \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": rpc error: code = NotFound desc = could not find container \"2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d\": container with ID starting with 2b79e5b58e638b0f064cac5a56ff8e687cd0db891a073c26d1444706639d567d not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726632 4775 scope.go:117] "RemoveContainer" containerID="5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726863 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99"} err="failed to get container status \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": rpc error: code = NotFound desc = could not find container \"5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99\": container with ID starting with 5a45f98fdb325d5650858b334451985167a08d8094ca5d4764b38f065ac3be99 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.726901 4775 scope.go:117] "RemoveContainer" 
containerID="f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727183 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28"} err="failed to get container status \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": rpc error: code = NotFound desc = could not find container \"f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28\": container with ID starting with f010836ee58725ca90fcfe1494b3ab9f696ee327b47a27a5108136724767fa28 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727205 4775 scope.go:117] "RemoveContainer" containerID="70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727415 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4"} err="failed to get container status \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": rpc error: code = NotFound desc = could not find container \"70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4\": container with ID starting with 70bb52f4030848dda1a76c85b6892ea6a3846de0f00e2c30c578383eacbd58f4 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727471 4775 scope.go:117] "RemoveContainer" containerID="6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc"} err="failed to get container status \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": rpc error: code = NotFound desc = could 
not find container \"6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc\": container with ID starting with 6726ea0f5d8c94e4b5308e327560a3e42cc8ad012c1df59a23d9a1f9df2758cc not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727721 4775 scope.go:117] "RemoveContainer" containerID="6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727940 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c"} err="failed to get container status \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": rpc error: code = NotFound desc = could not find container \"6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c\": container with ID starting with 6e99d8e95a7ea25573def712cc98bb5c7e2f2f69d38511bd2ab4742e91db296c not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.727961 4775 scope.go:117] "RemoveContainer" containerID="5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.728187 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde"} err="failed to get container status \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": rpc error: code = NotFound desc = could not find container \"5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde\": container with ID starting with 5c1c1bdc7ba87dbea213b904e9e6422dc5a7ba2870df331886a3351a9ab6ddde not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.728224 4775 scope.go:117] "RemoveContainer" containerID="effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 
05:00:02.728495 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6"} err="failed to get container status \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": rpc error: code = NotFound desc = could not find container \"effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6\": container with ID starting with effb207e18cc86a07c45d973b406a356cac22f7c470b51edfb44907c03dddea6 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.728524 4775 scope.go:117] "RemoveContainer" containerID="294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.728857 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646"} err="failed to get container status \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": rpc error: code = NotFound desc = could not find container \"294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646\": container with ID starting with 294e2a2d654e28444b7802aa1d470800d7c15c3f6dd9a43a454fb0f777f57646 not found: ID does not exist" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.755041 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bk98\" (UniqueName: \"kubernetes.io/projected/421d78dc-59bb-4b5b-9738-1eb6a6144b38-kube-api-access-5bk98\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.755083 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/421d78dc-59bb-4b5b-9738-1eb6a6144b38-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:02 crc kubenswrapper[4775]: I0321 05:00:02.755104 4775 reconciler_common.go:293] "Volume detached for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/421d78dc-59bb-4b5b-9738-1eb6a6144b38-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.454702 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.454705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq" event={"ID":"421d78dc-59bb-4b5b-9738-1eb6a6144b38","Type":"ContainerDied","Data":"639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d"} Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.454743 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639089e1b2eb45331baee2fe75955c82e4844add60351411ed01a1c3e360ac7d" Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.457670 4775 generic.go:334] "Generic (PLEG): container finished" podID="82b41c52-e915-4cf3-808e-79211ba01fce" containerID="944c691144c6d64318b896ba824894bb31ce3c8bf580a8c01deaa262441a63c9" exitCode=0 Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.457736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerDied","Data":"944c691144c6d64318b896ba824894bb31ce3c8bf580a8c01deaa262441a63c9"} Mar 21 05:00:03 crc kubenswrapper[4775]: I0321 05:00:03.669167 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69d31f5-deeb-4860-be96-ed5547831685" path="/var/lib/kubelet/pods/a69d31f5-deeb-4860-be96-ed5547831685/volumes" Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.467414 4775 generic.go:334] "Generic (PLEG): container finished" podID="1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" containerID="7d8eac46771bcd3a496da8e38bf914f7acad2a5e17c1713aea69b3a4abe8ee15" 
exitCode=0 Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.467725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-cngbx" event={"ID":"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3","Type":"ContainerDied","Data":"7d8eac46771bcd3a496da8e38bf914f7acad2a5e17c1713aea69b3a4abe8ee15"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"3ce7714c061cb7e728deed8361a9b8e649c04edc7ed358977f0df05a349b7f98"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471912 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"be86b922fb3a3004ef698fa897239834e25dc954ff78dba5132dd682a8fd24c8"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"80139f9faa1db99b3a577abaf2fa3340e452f86556905d28e07464685a43c73e"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471939 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"43c4bd2d6babd3920efc2542cee5a518b18a3b87f7abfd642a1e5298d3a51c5b"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"c3fed84362c1e40dd246975c6190af593b99ae7ef29801a7f05991178af0508a"} Mar 21 05:00:04 crc kubenswrapper[4775]: I0321 05:00:04.471962 
4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"e0ad820fd3d42363ea591d67aeaadfee76254b32094ddf56c05606ae5ef70d74"} Mar 21 05:00:05 crc kubenswrapper[4775]: I0321 05:00:05.560606 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:05 crc kubenswrapper[4775]: I0321 05:00:05.591313 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npphs\" (UniqueName: \"kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs\") pod \"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3\" (UID: \"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3\") " Mar 21 05:00:05 crc kubenswrapper[4775]: I0321 05:00:05.597201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs" (OuterVolumeSpecName: "kube-api-access-npphs") pod "1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" (UID: "1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3"). InnerVolumeSpecName "kube-api-access-npphs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:05 crc kubenswrapper[4775]: I0321 05:00:05.693365 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npphs\" (UniqueName: \"kubernetes.io/projected/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3-kube-api-access-npphs\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.490587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-cngbx" event={"ID":"1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3","Type":"ContainerDied","Data":"b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde"} Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.491175 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2b9b761df7ee4ff2e31ace5255a68fec2fe5933b83b3c71b10847ca26c3bcde" Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.490606 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-cngbx" Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.495753 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"4a835e9d20b66c10c58853a57bd29b47c4c381a131c3cfe660e54e8602509a9d"} Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.617085 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-fvs8k"] Mar 21 05:00:06 crc kubenswrapper[4775]: I0321 05:00:06.623595 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-fvs8k"] Mar 21 05:00:07 crc kubenswrapper[4775]: I0321 05:00:07.668284 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f9e3cb-7583-4349-98dc-422a7880fe5e" path="/var/lib/kubelet/pods/e8f9e3cb-7583-4349-98dc-422a7880fe5e/volumes" Mar 
21 05:00:09 crc kubenswrapper[4775]: I0321 05:00:09.514915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" event={"ID":"82b41c52-e915-4cf3-808e-79211ba01fce","Type":"ContainerStarted","Data":"c7c7c05ece6a5336c9e4fa813e81226180e55a9c7585dc95d8cc4824c277ca42"} Mar 21 05:00:09 crc kubenswrapper[4775]: I0321 05:00:09.515612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:09 crc kubenswrapper[4775]: I0321 05:00:09.515630 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:09 crc kubenswrapper[4775]: I0321 05:00:09.547972 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:09 crc kubenswrapper[4775]: I0321 05:00:09.553144 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" podStartSLOduration=7.5531060530000005 podStartE2EDuration="7.553106053s" podCreationTimestamp="2026-03-21 05:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:00:09.550836079 +0000 UTC m=+762.527299713" watchObservedRunningTime="2026-03-21 05:00:09.553106053 +0000 UTC m=+762.529569687" Mar 21 05:00:10 crc kubenswrapper[4775]: I0321 05:00:10.522105 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:10 crc kubenswrapper[4775]: I0321 05:00:10.573295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:16 crc kubenswrapper[4775]: I0321 05:00:16.661682 4775 scope.go:117] "RemoveContainer" containerID="ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e" Mar 
21 05:00:16 crc kubenswrapper[4775]: E0321 05:00:16.662351 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-556rg_openshift-multus(e77ec218-42da-4f07-b214-184c4f3b20f3)\"" pod="openshift-multus/multus-556rg" podUID="e77ec218-42da-4f07-b214-184c4f3b20f3" Mar 21 05:00:28 crc kubenswrapper[4775]: I0321 05:00:28.661930 4775 scope.go:117] "RemoveContainer" containerID="ad12949f26afe1756d4a6c0d01069cb26a8928bb36cc24602d4ab1bbde117f9e" Mar 21 05:00:29 crc kubenswrapper[4775]: I0321 05:00:29.664841 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/2.log" Mar 21 05:00:29 crc kubenswrapper[4775]: I0321 05:00:29.666883 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/1.log" Mar 21 05:00:29 crc kubenswrapper[4775]: I0321 05:00:29.667466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-556rg" event={"ID":"e77ec218-42da-4f07-b214-184c4f3b20f3","Type":"ContainerStarted","Data":"0db356b6d1097c58338f09b442231119fce13c2c3522c04f201fc30e7c3911fd"} Mar 21 05:00:32 crc kubenswrapper[4775]: I0321 05:00:32.421969 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sbftc" Mar 21 05:00:32 crc kubenswrapper[4775]: I0321 05:00:32.481839 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:00:32 crc kubenswrapper[4775]: I0321 05:00:32.481910 4775 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:00:37 crc kubenswrapper[4775]: I0321 05:00:37.179009 4775 scope.go:117] "RemoveContainer" containerID="55e31feec3221e168e396a1b14aac3afcd926bdfa911a175f057a586cb60827a" Mar 21 05:00:37 crc kubenswrapper[4775]: I0321 05:00:37.221325 4775 scope.go:117] "RemoveContainer" containerID="eaf83cd6f4a806265a871a5672f1a6939180308c2815b87ca254164f26d05aae" Mar 21 05:00:38 crc kubenswrapper[4775]: I0321 05:00:38.191040 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-556rg_e77ec218-42da-4f07-b214-184c4f3b20f3/kube-multus/2.log" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.123841 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp"] Mar 21 05:00:40 crc kubenswrapper[4775]: E0321 05:00:40.124459 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421d78dc-59bb-4b5b-9738-1eb6a6144b38" containerName="collect-profiles" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.124476 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="421d78dc-59bb-4b5b-9738-1eb6a6144b38" containerName="collect-profiles" Mar 21 05:00:40 crc kubenswrapper[4775]: E0321 05:00:40.124500 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" containerName="oc" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.124508 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" containerName="oc" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.124629 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" 
containerName="oc" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.124650 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="421d78dc-59bb-4b5b-9738-1eb6a6144b38" containerName="collect-profiles" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.125425 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.128010 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.135091 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp"] Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.219213 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfpl\" (UniqueName: \"kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.219307 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.219390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.320088 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.320203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfpl\" (UniqueName: \"kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.320225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.320709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: 
\"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.320949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.338052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfpl\" (UniqueName: \"kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.439459 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:40 crc kubenswrapper[4775]: I0321 05:00:40.620589 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp"] Mar 21 05:00:41 crc kubenswrapper[4775]: I0321 05:00:41.213096 4775 generic.go:334] "Generic (PLEG): container finished" podID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerID="36a131f8b044d35d9bd3c10c1ce29ee06e1b450f9a0c4659063f6c4ddd7f5ec1" exitCode=0 Mar 21 05:00:41 crc kubenswrapper[4775]: I0321 05:00:41.213815 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" event={"ID":"b2ca57eb-edac-4db7-bfa4-3198d80daf99","Type":"ContainerDied","Data":"36a131f8b044d35d9bd3c10c1ce29ee06e1b450f9a0c4659063f6c4ddd7f5ec1"} Mar 21 05:00:41 crc kubenswrapper[4775]: I0321 05:00:41.213905 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" event={"ID":"b2ca57eb-edac-4db7-bfa4-3198d80daf99","Type":"ContainerStarted","Data":"e01461b988824f083ab26e81f6c7aec2773113cbff030f0de9786bda5a2952a6"} Mar 21 05:00:43 crc kubenswrapper[4775]: I0321 05:00:43.226165 4775 generic.go:334] "Generic (PLEG): container finished" podID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerID="88a56d96f0d83816ad532652ef039719ef3427187d9249736e96283a01f5026a" exitCode=0 Mar 21 05:00:43 crc kubenswrapper[4775]: I0321 05:00:43.226239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" event={"ID":"b2ca57eb-edac-4db7-bfa4-3198d80daf99","Type":"ContainerDied","Data":"88a56d96f0d83816ad532652ef039719ef3427187d9249736e96283a01f5026a"} Mar 21 05:00:44 crc kubenswrapper[4775]: I0321 05:00:44.237899 4775 
generic.go:334] "Generic (PLEG): container finished" podID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerID="6247146119e67cb4d90c830a7ac3fc2c13acc7a731c50205e40dffabbe26a488" exitCode=0 Mar 21 05:00:44 crc kubenswrapper[4775]: I0321 05:00:44.237966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" event={"ID":"b2ca57eb-edac-4db7-bfa4-3198d80daf99","Type":"ContainerDied","Data":"6247146119e67cb4d90c830a7ac3fc2c13acc7a731c50205e40dffabbe26a488"} Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.446799 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.510967 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util\") pod \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.511035 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rfpl\" (UniqueName: \"kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl\") pod \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.511078 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle\") pod \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\" (UID: \"b2ca57eb-edac-4db7-bfa4-3198d80daf99\") " Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.511840 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle" (OuterVolumeSpecName: "bundle") pod "b2ca57eb-edac-4db7-bfa4-3198d80daf99" (UID: "b2ca57eb-edac-4db7-bfa4-3198d80daf99"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.522388 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl" (OuterVolumeSpecName: "kube-api-access-4rfpl") pod "b2ca57eb-edac-4db7-bfa4-3198d80daf99" (UID: "b2ca57eb-edac-4db7-bfa4-3198d80daf99"). InnerVolumeSpecName "kube-api-access-4rfpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.525260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util" (OuterVolumeSpecName: "util") pod "b2ca57eb-edac-4db7-bfa4-3198d80daf99" (UID: "b2ca57eb-edac-4db7-bfa4-3198d80daf99"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.612463 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.612502 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rfpl\" (UniqueName: \"kubernetes.io/projected/b2ca57eb-edac-4db7-bfa4-3198d80daf99-kube-api-access-4rfpl\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:45 crc kubenswrapper[4775]: I0321 05:00:45.612512 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2ca57eb-edac-4db7-bfa4-3198d80daf99-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:46 crc kubenswrapper[4775]: I0321 05:00:46.256453 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" event={"ID":"b2ca57eb-edac-4db7-bfa4-3198d80daf99","Type":"ContainerDied","Data":"e01461b988824f083ab26e81f6c7aec2773113cbff030f0de9786bda5a2952a6"} Mar 21 05:00:46 crc kubenswrapper[4775]: I0321 05:00:46.256546 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e01461b988824f083ab26e81f6c7aec2773113cbff030f0de9786bda5a2952a6" Mar 21 05:00:46 crc kubenswrapper[4775]: I0321 05:00:46.256564 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.404913 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xljlp"] Mar 21 05:00:48 crc kubenswrapper[4775]: E0321 05:00:48.405482 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="util" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.405502 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="util" Mar 21 05:00:48 crc kubenswrapper[4775]: E0321 05:00:48.405526 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="extract" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.405537 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="extract" Mar 21 05:00:48 crc kubenswrapper[4775]: E0321 05:00:48.405554 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="pull" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.405564 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="pull" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.405714 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ca57eb-edac-4db7-bfa4-3198d80daf99" containerName="extract" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.406178 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.407603 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-26vmx" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.407761 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.407870 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.416612 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xljlp"] Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.549228 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npn4x\" (UniqueName: \"kubernetes.io/projected/f4a0a79a-5b67-44f8-9ef5-304530d5e764-kube-api-access-npn4x\") pod \"nmstate-operator-796d4cfff4-xljlp\" (UID: \"f4a0a79a-5b67-44f8-9ef5-304530d5e764\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.650656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npn4x\" (UniqueName: \"kubernetes.io/projected/f4a0a79a-5b67-44f8-9ef5-304530d5e764-kube-api-access-npn4x\") pod \"nmstate-operator-796d4cfff4-xljlp\" (UID: \"f4a0a79a-5b67-44f8-9ef5-304530d5e764\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.669908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npn4x\" (UniqueName: \"kubernetes.io/projected/f4a0a79a-5b67-44f8-9ef5-304530d5e764-kube-api-access-npn4x\") pod \"nmstate-operator-796d4cfff4-xljlp\" (UID: 
\"f4a0a79a-5b67-44f8-9ef5-304530d5e764\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.721554 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" Mar 21 05:00:48 crc kubenswrapper[4775]: I0321 05:00:48.915437 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xljlp"] Mar 21 05:00:49 crc kubenswrapper[4775]: I0321 05:00:49.273248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" event={"ID":"f4a0a79a-5b67-44f8-9ef5-304530d5e764","Type":"ContainerStarted","Data":"a85340a544736d29c8ba4919cf40b85e8a8702c0d41080670e1974a9f1db9478"} Mar 21 05:00:52 crc kubenswrapper[4775]: I0321 05:00:52.299068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" event={"ID":"f4a0a79a-5b67-44f8-9ef5-304530d5e764","Type":"ContainerStarted","Data":"d17303fb14c393392fb271c5ccb948311cc98edae2519ee1050ffcbb981fdfb3"} Mar 21 05:00:52 crc kubenswrapper[4775]: I0321 05:00:52.321251 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xljlp" podStartSLOduration=1.691041025 podStartE2EDuration="4.32123131s" podCreationTimestamp="2026-03-21 05:00:48 +0000 UTC" firstStartedPulling="2026-03-21 05:00:48.919628472 +0000 UTC m=+801.896092096" lastFinishedPulling="2026-03-21 05:00:51.549818757 +0000 UTC m=+804.526282381" observedRunningTime="2026-03-21 05:00:52.318066301 +0000 UTC m=+805.294529935" watchObservedRunningTime="2026-03-21 05:00:52.32123131 +0000 UTC m=+805.297694934" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.195440 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-626kh"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.196832 
4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.198972 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-nkc28" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.211272 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-626kh"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.216084 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.216889 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.219885 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.229150 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.243935 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6l74f"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.244759 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.309886 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-ovs-socket\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.309928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-dbus-socket\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.309952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-nmstate-lock\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.309974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxxf\" (UniqueName: \"kubernetes.io/projected/d03e5939-1625-4597-ad3b-9edf8e8075f5-kube-api-access-bwxxf\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.310005 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drl6w\" (UniqueName: \"kubernetes.io/projected/409422f2-717f-4f82-8dae-fe01dfda7083-kube-api-access-drl6w\") pod 
\"nmstate-metrics-9b8c8685d-626kh\" (UID: \"409422f2-717f-4f82-8dae-fe01dfda7083\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.310043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.310063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfmp7\" (UniqueName: \"kubernetes.io/projected/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-kube-api-access-lfmp7\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.344182 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.344890 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.347780 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nsznh" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.348019 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.348176 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.355698 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"] Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97vw\" (UniqueName: \"kubernetes.io/projected/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-kube-api-access-d97vw\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411454 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-ovs-socket\") pod \"nmstate-handler-6l74f\" (UID: 
\"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-dbus-socket\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-nmstate-lock\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxxf\" (UniqueName: \"kubernetes.io/projected/d03e5939-1625-4597-ad3b-9edf8e8075f5-kube-api-access-bwxxf\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drl6w\" (UniqueName: \"kubernetes.io/projected/409422f2-717f-4f82-8dae-fe01dfda7083-kube-api-access-drl6w\") pod \"nmstate-metrics-9b8c8685d-626kh\" (UID: 
\"409422f2-717f-4f82-8dae-fe01dfda7083\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfmp7\" (UniqueName: \"kubernetes.io/projected/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-kube-api-access-lfmp7\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.411757 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.412084 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-nmstate-lock\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.412143 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-ovs-socket\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.412201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d03e5939-1625-4597-ad3b-9edf8e8075f5-dbus-socket\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " 
pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.420793 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.430664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drl6w\" (UniqueName: \"kubernetes.io/projected/409422f2-717f-4f82-8dae-fe01dfda7083-kube-api-access-drl6w\") pod \"nmstate-metrics-9b8c8685d-626kh\" (UID: \"409422f2-717f-4f82-8dae-fe01dfda7083\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.433356 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfmp7\" (UniqueName: \"kubernetes.io/projected/944b76e5-c8c5-4cba-9df2-9e9b87b540a8-kube-api-access-lfmp7\") pod \"nmstate-webhook-5f558f5558-rptn7\" (UID: \"944b76e5-c8c5-4cba-9df2-9e9b87b540a8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.435391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxxf\" (UniqueName: \"kubernetes.io/projected/d03e5939-1625-4597-ad3b-9edf8e8075f5-kube-api-access-bwxxf\") pod \"nmstate-handler-6l74f\" (UID: \"d03e5939-1625-4597-ad3b-9edf8e8075f5\") " pod="openshift-nmstate/nmstate-handler-6l74f" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.513214 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.513297 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.513351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97vw\" (UniqueName: \"kubernetes.io/projected/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-kube-api-access-d97vw\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" Mar 21 05:00:53 crc kubenswrapper[4775]: E0321 05:00:53.513732 4775 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 21 05:00:53 crc kubenswrapper[4775]: E0321 05:00:53.513808 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert podName:b0fbab95-1c88-40a3-8ccb-58bca74c8f3c nodeName:}" failed. No retries permitted until 2026-03-21 05:00:54.013788572 +0000 UTC m=+806.990252196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-8x7x2" (UID: "b0fbab95-1c88-40a3-8ccb-58bca74c8f3c") : secret "plugin-serving-cert" not found
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.514077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.522725 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.530415 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f5cb9cdf8-7rs8l"]
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.531353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.535688 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.535980 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97vw\" (UniqueName: \"kubernetes.io/projected/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-kube-api-access-d97vw\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.554608 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f5cb9cdf8-7rs8l"]
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.567990 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6l74f"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.613903 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-service-ca\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.613943 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-oauth-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.613971 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-oauth-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.621552 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-trusted-ca-bundle\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.621675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6ns\" (UniqueName: \"kubernetes.io/projected/552a8ec9-0977-4124-8cec-00e0b85fbc8d-kube-api-access-lx6ns\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.621755 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.621786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723489 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-service-ca\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-oauth-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-oauth-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723794 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-trusted-ca-bundle\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6ns\" (UniqueName: \"kubernetes.io/projected/552a8ec9-0977-4124-8cec-00e0b85fbc8d-kube-api-access-lx6ns\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723904 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.723926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.724587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-service-ca\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.724615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-oauth-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.724977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.725098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552a8ec9-0977-4124-8cec-00e0b85fbc8d-trusted-ca-bundle\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.729676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-oauth-config\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.729804 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552a8ec9-0977-4124-8cec-00e0b85fbc8d-console-serving-cert\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.739599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6ns\" (UniqueName: \"kubernetes.io/projected/552a8ec9-0977-4124-8cec-00e0b85fbc8d-kube-api-access-lx6ns\") pod \"console-6f5cb9cdf8-7rs8l\" (UID: \"552a8ec9-0977-4124-8cec-00e0b85fbc8d\") " pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.768151 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-626kh"]
Mar 21 05:00:53 crc kubenswrapper[4775]: W0321 05:00:53.768309 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod409422f2_717f_4f82_8dae_fe01dfda7083.slice/crio-1d3267450edd9496263daf21110ec6f9a451661a7cefefc5ae4e58832a5cb02b WatchSource:0}: Error finding container 1d3267450edd9496263daf21110ec6f9a451661a7cefefc5ae4e58832a5cb02b: Status 404 returned error can't find the container with id 1d3267450edd9496263daf21110ec6f9a451661a7cefefc5ae4e58832a5cb02b
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.809794 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"]
Mar 21 05:00:53 crc kubenswrapper[4775]: W0321 05:00:53.810173 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944b76e5_c8c5_4cba_9df2_9e9b87b540a8.slice/crio-90555b21c6e51115e21138f33eb614f4dd3c6518bac50d302dd41c2851a3570e WatchSource:0}: Error finding container 90555b21c6e51115e21138f33eb614f4dd3c6518bac50d302dd41c2851a3570e: Status 404 returned error can't find the container with id 90555b21c6e51115e21138f33eb614f4dd3c6518bac50d302dd41c2851a3570e
Mar 21 05:00:53 crc kubenswrapper[4775]: I0321 05:00:53.908173 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.031244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.036017 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fbab95-1c88-40a3-8ccb-58bca74c8f3c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-8x7x2\" (UID: \"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.110752 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f5cb9cdf8-7rs8l"]
Mar 21 05:00:54 crc kubenswrapper[4775]: W0321 05:00:54.116239 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552a8ec9_0977_4124_8cec_00e0b85fbc8d.slice/crio-fc9d44e29491eb027572cbe53cb1eff5ccf0b664f29695d8334d930979423a33 WatchSource:0}: Error finding container fc9d44e29491eb027572cbe53cb1eff5ccf0b664f29695d8334d930979423a33: Status 404 returned error can't find the container with id fc9d44e29491eb027572cbe53cb1eff5ccf0b664f29695d8334d930979423a33
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.260750 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.317886 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" event={"ID":"409422f2-717f-4f82-8dae-fe01dfda7083","Type":"ContainerStarted","Data":"1d3267450edd9496263daf21110ec6f9a451661a7cefefc5ae4e58832a5cb02b"}
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.320516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f5cb9cdf8-7rs8l" event={"ID":"552a8ec9-0977-4124-8cec-00e0b85fbc8d","Type":"ContainerStarted","Data":"3e7a82b84b89b9c7236ebc322b6f2a7fec36ac042a44ffdc020c14e50603242d"}
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.320736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f5cb9cdf8-7rs8l" event={"ID":"552a8ec9-0977-4124-8cec-00e0b85fbc8d","Type":"ContainerStarted","Data":"fc9d44e29491eb027572cbe53cb1eff5ccf0b664f29695d8334d930979423a33"}
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.322889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6l74f" event={"ID":"d03e5939-1625-4597-ad3b-9edf8e8075f5","Type":"ContainerStarted","Data":"15538dc9766f3a74b3af18a8ff5b6cbca13dbed2cdd76c6e407ab128491aef08"}
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.326289 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" event={"ID":"944b76e5-c8c5-4cba-9df2-9e9b87b540a8","Type":"ContainerStarted","Data":"90555b21c6e51115e21138f33eb614f4dd3c6518bac50d302dd41c2851a3570e"}
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.341388 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f5cb9cdf8-7rs8l" podStartSLOduration=1.341368592 podStartE2EDuration="1.341368592s" podCreationTimestamp="2026-03-21 05:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:00:54.340694023 +0000 UTC m=+807.317157647" watchObservedRunningTime="2026-03-21 05:00:54.341368592 +0000 UTC m=+807.317832206"
Mar 21 05:00:54 crc kubenswrapper[4775]: I0321 05:00:54.538597 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2"]
Mar 21 05:00:54 crc kubenswrapper[4775]: W0321 05:00:54.546715 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0fbab95_1c88_40a3_8ccb_58bca74c8f3c.slice/crio-a023ff7a200326a4a00013d768ba6028b20e32eb59ada071ba4db2fdde9bf6fb WatchSource:0}: Error finding container a023ff7a200326a4a00013d768ba6028b20e32eb59ada071ba4db2fdde9bf6fb: Status 404 returned error can't find the container with id a023ff7a200326a4a00013d768ba6028b20e32eb59ada071ba4db2fdde9bf6fb
Mar 21 05:00:55 crc kubenswrapper[4775]: I0321 05:00:55.332597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" event={"ID":"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c","Type":"ContainerStarted","Data":"a023ff7a200326a4a00013d768ba6028b20e32eb59ada071ba4db2fdde9bf6fb"}
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.349248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6l74f" event={"ID":"d03e5939-1625-4597-ad3b-9edf8e8075f5","Type":"ContainerStarted","Data":"c8f8b3621d8aebe8542eaca9c486a0394dea6add6674b43c81c01465cd4477be"}
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.350258 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6l74f"
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.351771 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" event={"ID":"944b76e5-c8c5-4cba-9df2-9e9b87b540a8","Type":"ContainerStarted","Data":"0b510fbd54cb404ec5225713a1e467cb92b4867514763b60e365ae51a7844e1d"}
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.352398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.355045 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" event={"ID":"409422f2-717f-4f82-8dae-fe01dfda7083","Type":"ContainerStarted","Data":"5db6f5f91bd6657bafe597321f3623e2d7d3343cf1edbf44c8c79b9357dfb233"}
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.369990 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6l74f" podStartSLOduration=1.642006135 podStartE2EDuration="4.369961092s" podCreationTimestamp="2026-03-21 05:00:53 +0000 UTC" firstStartedPulling="2026-03-21 05:00:53.622208035 +0000 UTC m=+806.598671659" lastFinishedPulling="2026-03-21 05:00:56.350162952 +0000 UTC m=+809.326626616" observedRunningTime="2026-03-21 05:00:57.365915768 +0000 UTC m=+810.342379422" watchObservedRunningTime="2026-03-21 05:00:57.369961092 +0000 UTC m=+810.346424716"
Mar 21 05:00:57 crc kubenswrapper[4775]: I0321 05:00:57.390741 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7" podStartSLOduration=1.8983759980000001 podStartE2EDuration="4.390723199s" podCreationTimestamp="2026-03-21 05:00:53 +0000 UTC" firstStartedPulling="2026-03-21 05:00:53.811724989 +0000 UTC m=+806.788188613" lastFinishedPulling="2026-03-21 05:00:56.30407219 +0000 UTC m=+809.280535814" observedRunningTime="2026-03-21 05:00:57.388792335 +0000 UTC m=+810.365255959" watchObservedRunningTime="2026-03-21 05:00:57.390723199 +0000 UTC m=+810.367186843"
Mar 21 05:00:58 crc kubenswrapper[4775]: I0321 05:00:58.364112 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" event={"ID":"b0fbab95-1c88-40a3-8ccb-58bca74c8f3c","Type":"ContainerStarted","Data":"479c50c0568fe490b9c0c88db80fe55cd43a3d41946da6328955d3ae544b6912"}
Mar 21 05:00:58 crc kubenswrapper[4775]: I0321 05:00:58.390014 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-8x7x2" podStartSLOduration=2.665570981 podStartE2EDuration="5.389995839s" podCreationTimestamp="2026-03-21 05:00:53 +0000 UTC" firstStartedPulling="2026-03-21 05:00:54.548624697 +0000 UTC m=+807.525088321" lastFinishedPulling="2026-03-21 05:00:57.273049555 +0000 UTC m=+810.249513179" observedRunningTime="2026-03-21 05:00:58.378430872 +0000 UTC m=+811.354894516" watchObservedRunningTime="2026-03-21 05:00:58.389995839 +0000 UTC m=+811.366459463"
Mar 21 05:00:59 crc kubenswrapper[4775]: I0321 05:00:59.372078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" event={"ID":"409422f2-717f-4f82-8dae-fe01dfda7083","Type":"ContainerStarted","Data":"513ac8f75e5d22ae65ecdd4bda0fd91706a6f342c292c674620878f6890dc19d"}
Mar 21 05:00:59 crc kubenswrapper[4775]: I0321 05:00:59.401743 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-626kh" podStartSLOduration=1.586026514 podStartE2EDuration="6.401720541s" podCreationTimestamp="2026-03-21 05:00:53 +0000 UTC" firstStartedPulling="2026-03-21 05:00:53.77070197 +0000 UTC m=+806.747165604" lastFinishedPulling="2026-03-21 05:00:58.586396017 +0000 UTC m=+811.562859631" observedRunningTime="2026-03-21 05:00:59.399162239 +0000 UTC m=+812.375625883" watchObservedRunningTime="2026-03-21 05:00:59.401720541 +0000 UTC m=+812.378184185"
Mar 21 05:01:02 crc kubenswrapper[4775]: I0321 05:01:02.482378 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:01:02 crc kubenswrapper[4775]: I0321 05:01:02.483040 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:01:03 crc kubenswrapper[4775]: I0321 05:01:03.599817 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6l74f"
Mar 21 05:01:03 crc kubenswrapper[4775]: I0321 05:01:03.908825 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:01:03 crc kubenswrapper[4775]: I0321 05:01:03.908933 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:01:03 crc kubenswrapper[4775]: I0321 05:01:03.916248 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:01:04 crc kubenswrapper[4775]: I0321 05:01:04.424081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f5cb9cdf8-7rs8l"
Mar 21 05:01:04 crc kubenswrapper[4775]: I0321 05:01:04.523797 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wst2s"]
Mar 21 05:01:13 crc kubenswrapper[4775]: I0321 05:01:13.542634 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rptn7"
Mar 21 05:01:20 crc kubenswrapper[4775]: I0321 05:01:20.718827 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.385111 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"]
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.387827 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.389356 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.396525 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"]
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.515595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.515661 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqtp7\" (UniqueName: \"kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.515692 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.617475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.617890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqtp7\" (UniqueName: \"kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.617938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.618744 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.619033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.648006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqtp7\" (UniqueName: \"kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.702959 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"
Mar 21 05:01:25 crc kubenswrapper[4775]: I0321 05:01:25.925720 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s"]
Mar 21 05:01:26 crc kubenswrapper[4775]: I0321 05:01:26.579426 4775 generic.go:334] "Generic (PLEG): container finished" podID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerID="d003d37849ef559a0570ea5750927b9875f1a10ae825ecea7f42bb3392bd2ce8" exitCode=0
Mar 21 05:01:26 crc kubenswrapper[4775]: I0321 05:01:26.579503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" event={"ID":"6252c43a-d149-46f7-ac6c-263c34980fe2","Type":"ContainerDied","Data":"d003d37849ef559a0570ea5750927b9875f1a10ae825ecea7f42bb3392bd2ce8"}
Mar 21 05:01:26 crc kubenswrapper[4775]: I0321 05:01:26.579544 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" event={"ID":"6252c43a-d149-46f7-ac6c-263c34980fe2","Type":"ContainerStarted","Data":"9848d056d20a28e5d15c2a52eb1b9b4a66212a967ad7a7690bd7227caa27e415"}
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.505914 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"]
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.507780 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.521240 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"]
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.644457 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ww7q\" (UniqueName: \"kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.644537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.644626 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.746257 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.746315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ww7q\" (UniqueName: \"kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.746389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.746881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.746997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.770091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ww7q\" (UniqueName: \"kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q\") pod \"redhat-operators-w5l6s\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:27 crc kubenswrapper[4775]: I0321 05:01:27.827397 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5l6s"
Mar 21 05:01:28 crc kubenswrapper[4775]: I0321 05:01:28.056530 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"]
Mar 21 05:01:28 crc kubenswrapper[4775]: I0321 05:01:28.593666 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerID="2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a" exitCode=0
Mar 21 05:01:28 crc kubenswrapper[4775]: I0321 05:01:28.593718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerDied","Data":"2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a"}
Mar 21 05:01:28 crc kubenswrapper[4775]: I0321 05:01:28.593936 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerStarted","Data":"e0c863098fc3ee0671df4688c9648d37c09a40f58d77f06335e48b368ac9c588"}
Mar 21 05:01:29 crc kubenswrapper[4775]: I0321 05:01:29.561630 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wst2s" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" containerID="cri-o://4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764" gracePeriod=15
Mar 21 05:01:29 crc kubenswrapper[4775]: I0321 05:01:29.602624 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerStarted","Data":"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b"}
Mar 21 05:01:29 crc kubenswrapper[4775]: I0321 05:01:29.605813 4775 generic.go:334] "Generic (PLEG): container finished" podID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerID="2af57a8d89d834477d936f1accea68294ed6669b9cde85ec3a277a286c70ebc0" exitCode=0
Mar 21 05:01:29 crc kubenswrapper[4775]: I0321 05:01:29.605847 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" event={"ID":"6252c43a-d149-46f7-ac6c-263c34980fe2","Type":"ContainerDied","Data":"2af57a8d89d834477d936f1accea68294ed6669b9cde85ec3a277a286c70ebc0"}
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.055503 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wst2s_a9ed1e0e-eff9-4690-bcf5-45f6074c200e/console/0.log"
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.055568 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wst2s"
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184618 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6fm\" (UniqueName: \"kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") "
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184680 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") "
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") "
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184771 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") "
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184823 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") "
Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184882 4775 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.184905 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config\") pod \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\" (UID: \"a9ed1e0e-eff9-4690-bcf5-45f6074c200e\") " Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.185815 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.185826 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca" (OuterVolumeSpecName: "service-ca") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.185841 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config" (OuterVolumeSpecName: "console-config") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.186206 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.191356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm" (OuterVolumeSpecName: "kube-api-access-8m6fm") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "kube-api-access-8m6fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.191524 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.196515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a9ed1e0e-eff9-4690-bcf5-45f6074c200e" (UID: "a9ed1e0e-eff9-4690-bcf5-45f6074c200e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287471 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287565 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287587 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287604 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287622 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287638 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6fm\" (UniqueName: \"kubernetes.io/projected/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-kube-api-access-8m6fm\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.287658 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9ed1e0e-eff9-4690-bcf5-45f6074c200e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:30 crc 
kubenswrapper[4775]: I0321 05:01:30.618772 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerID="bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b" exitCode=0 Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.618847 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerDied","Data":"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b"} Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621709 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wst2s_a9ed1e0e-eff9-4690-bcf5-45f6074c200e/console/0.log" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621740 4775 generic.go:334] "Generic (PLEG): container finished" podID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerID="4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764" exitCode=2 Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621777 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wst2s" event={"ID":"a9ed1e0e-eff9-4690-bcf5-45f6074c200e","Type":"ContainerDied","Data":"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764"} Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621793 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wst2s" event={"ID":"a9ed1e0e-eff9-4690-bcf5-45f6074c200e","Type":"ContainerDied","Data":"c582e2fab200954a9843b0514b416a8988bf80f007355b323c81a87c8930c6fa"} Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621809 4775 scope.go:117] "RemoveContainer" containerID="4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.621879 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wst2s" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.627075 4775 generic.go:334] "Generic (PLEG): container finished" podID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerID="a8c0ef7d7f4885716379df92240c87e2eb767b281e561cce6a2be3b2c742f219" exitCode=0 Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.627247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" event={"ID":"6252c43a-d149-46f7-ac6c-263c34980fe2","Type":"ContainerDied","Data":"a8c0ef7d7f4885716379df92240c87e2eb767b281e561cce6a2be3b2c742f219"} Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.641435 4775 scope.go:117] "RemoveContainer" containerID="4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764" Mar 21 05:01:30 crc kubenswrapper[4775]: E0321 05:01:30.641949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764\": container with ID starting with 4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764 not found: ID does not exist" containerID="4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.642026 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764"} err="failed to get container status \"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764\": rpc error: code = NotFound desc = could not find container \"4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764\": container with ID starting with 4843fbb3995c43ae7854ec367e570829c36ae201e54901d87c5c625dcdce3764 not found: ID does not exist" Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.687604 4775 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wst2s"] Mar 21 05:01:30 crc kubenswrapper[4775]: I0321 05:01:30.693231 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wst2s"] Mar 21 05:01:31 crc kubenswrapper[4775]: I0321 05:01:31.635605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerStarted","Data":"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd"} Mar 21 05:01:31 crc kubenswrapper[4775]: I0321 05:01:31.663870 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5l6s" podStartSLOduration=1.943428644 podStartE2EDuration="4.663843129s" podCreationTimestamp="2026-03-21 05:01:27 +0000 UTC" firstStartedPulling="2026-03-21 05:01:28.595073263 +0000 UTC m=+841.571536887" lastFinishedPulling="2026-03-21 05:01:31.315487738 +0000 UTC m=+844.291951372" observedRunningTime="2026-03-21 05:01:31.657472319 +0000 UTC m=+844.633935943" watchObservedRunningTime="2026-03-21 05:01:31.663843129 +0000 UTC m=+844.640306773" Mar 21 05:01:31 crc kubenswrapper[4775]: I0321 05:01:31.676412 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" path="/var/lib/kubelet/pods/a9ed1e0e-eff9-4690-bcf5-45f6074c200e/volumes" Mar 21 05:01:31 crc kubenswrapper[4775]: I0321 05:01:31.850647 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.006762 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util\") pod \"6252c43a-d149-46f7-ac6c-263c34980fe2\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.006833 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqtp7\" (UniqueName: \"kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7\") pod \"6252c43a-d149-46f7-ac6c-263c34980fe2\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.006876 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle\") pod \"6252c43a-d149-46f7-ac6c-263c34980fe2\" (UID: \"6252c43a-d149-46f7-ac6c-263c34980fe2\") " Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.009405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle" (OuterVolumeSpecName: "bundle") pod "6252c43a-d149-46f7-ac6c-263c34980fe2" (UID: "6252c43a-d149-46f7-ac6c-263c34980fe2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.012666 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7" (OuterVolumeSpecName: "kube-api-access-cqtp7") pod "6252c43a-d149-46f7-ac6c-263c34980fe2" (UID: "6252c43a-d149-46f7-ac6c-263c34980fe2"). InnerVolumeSpecName "kube-api-access-cqtp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.027092 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util" (OuterVolumeSpecName: "util") pod "6252c43a-d149-46f7-ac6c-263c34980fe2" (UID: "6252c43a-d149-46f7-ac6c-263c34980fe2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.108301 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.108357 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqtp7\" (UniqueName: \"kubernetes.io/projected/6252c43a-d149-46f7-ac6c-263c34980fe2-kube-api-access-cqtp7\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.108373 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6252c43a-d149-46f7-ac6c-263c34980fe2-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.482224 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.482287 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.482335 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.483008 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.483091 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4" gracePeriod=600 Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.658630 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4" exitCode=0 Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.658741 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4"} Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.658800 4775 scope.go:117] "RemoveContainer" containerID="948421a752c90ac0f2fbc508e5358894c1d0ccf82922efe510f0505bdc2c2715" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.662900 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.663278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s" event={"ID":"6252c43a-d149-46f7-ac6c-263c34980fe2","Type":"ContainerDied","Data":"9848d056d20a28e5d15c2a52eb1b9b4a66212a967ad7a7690bd7227caa27e415"} Mar 21 05:01:32 crc kubenswrapper[4775]: I0321 05:01:32.663325 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9848d056d20a28e5d15c2a52eb1b9b4a66212a967ad7a7690bd7227caa27e415" Mar 21 05:01:33 crc kubenswrapper[4775]: I0321 05:01:33.694024 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7"} Mar 21 05:01:37 crc kubenswrapper[4775]: I0321 05:01:37.827841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:37 crc kubenswrapper[4775]: I0321 05:01:37.828519 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:39 crc kubenswrapper[4775]: I0321 05:01:39.065857 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5l6s" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="registry-server" probeResult="failure" output=< Mar 21 05:01:39 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:01:39 crc kubenswrapper[4775]: > Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.570500 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn"] Mar 21 05:01:43 crc kubenswrapper[4775]: E0321 05:01:43.571305 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="util" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571323 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="util" Mar 21 05:01:43 crc kubenswrapper[4775]: E0321 05:01:43.571347 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571354 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" Mar 21 05:01:43 crc kubenswrapper[4775]: E0321 05:01:43.571368 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="pull" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571376 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="pull" Mar 21 05:01:43 crc kubenswrapper[4775]: E0321 05:01:43.571414 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="extract" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571421 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="extract" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571543 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ed1e0e-eff9-4690-bcf5-45f6074c200e" containerName="console" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.571567 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252c43a-d149-46f7-ac6c-263c34980fe2" containerName="extract" Mar 21 05:01:43 crc 
kubenswrapper[4775]: I0321 05:01:43.572021 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.576529 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.576841 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.577260 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.579322 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.581162 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-msjcl" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.612309 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn"] Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.619425 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-apiservice-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.619500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-webhook-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.619540 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb947\" (UniqueName: \"kubernetes.io/projected/38d80f78-fa33-49b7-99c2-62d50d1c011b-kube-api-access-hb947\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.726665 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-webhook-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.726723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb947\" (UniqueName: \"kubernetes.io/projected/38d80f78-fa33-49b7-99c2-62d50d1c011b-kube-api-access-hb947\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.726862 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-apiservice-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " 
pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.736821 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-apiservice-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.738052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/38d80f78-fa33-49b7-99c2-62d50d1c011b-webhook-cert\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.795805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb947\" (UniqueName: \"kubernetes.io/projected/38d80f78-fa33-49b7-99c2-62d50d1c011b-kube-api-access-hb947\") pod \"metallb-operator-controller-manager-6695f56dbb-f6tqn\" (UID: \"38d80f78-fa33-49b7-99c2-62d50d1c011b\") " pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:43 crc kubenswrapper[4775]: I0321 05:01:43.893590 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.026564 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn"] Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.027273 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.031159 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.031480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-84rwk" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.034980 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.052883 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn"] Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.133018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrhz\" (UniqueName: \"kubernetes.io/projected/8c52832d-1aee-4eac-b625-24110b985402-kube-api-access-4jrhz\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.133439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-apiservice-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.133481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-webhook-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.234232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrhz\" (UniqueName: \"kubernetes.io/projected/8c52832d-1aee-4eac-b625-24110b985402-kube-api-access-4jrhz\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.234275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-apiservice-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.234312 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-webhook-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.240826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-webhook-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc 
kubenswrapper[4775]: I0321 05:01:44.241869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c52832d-1aee-4eac-b625-24110b985402-apiservice-cert\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.255946 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrhz\" (UniqueName: \"kubernetes.io/projected/8c52832d-1aee-4eac-b625-24110b985402-kube-api-access-4jrhz\") pod \"metallb-operator-webhook-server-559bfcf5c-qqsvn\" (UID: \"8c52832d-1aee-4eac-b625-24110b985402\") " pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.351885 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.356639 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn"] Mar 21 05:01:44 crc kubenswrapper[4775]: W0321 05:01:44.387186 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d80f78_fa33_49b7_99c2_62d50d1c011b.slice/crio-fdd55ebd093a8f7c065a1d3ab318dd78e829212ce1f2638f9ee9aabd5cffde3f WatchSource:0}: Error finding container fdd55ebd093a8f7c065a1d3ab318dd78e829212ce1f2638f9ee9aabd5cffde3f: Status 404 returned error can't find the container with id fdd55ebd093a8f7c065a1d3ab318dd78e829212ce1f2638f9ee9aabd5cffde3f Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.565950 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn"] Mar 21 05:01:44 crc 
kubenswrapper[4775]: W0321 05:01:44.577328 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c52832d_1aee_4eac_b625_24110b985402.slice/crio-c9a74335940a7c570ab8a8fdf97fcea4e88cf8d6d72777f0d31424d8d6bebd67 WatchSource:0}: Error finding container c9a74335940a7c570ab8a8fdf97fcea4e88cf8d6d72777f0d31424d8d6bebd67: Status 404 returned error can't find the container with id c9a74335940a7c570ab8a8fdf97fcea4e88cf8d6d72777f0d31424d8d6bebd67 Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.766083 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" event={"ID":"38d80f78-fa33-49b7-99c2-62d50d1c011b","Type":"ContainerStarted","Data":"fdd55ebd093a8f7c065a1d3ab318dd78e829212ce1f2638f9ee9aabd5cffde3f"} Mar 21 05:01:44 crc kubenswrapper[4775]: I0321 05:01:44.768622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" event={"ID":"8c52832d-1aee-4eac-b625-24110b985402","Type":"ContainerStarted","Data":"c9a74335940a7c570ab8a8fdf97fcea4e88cf8d6d72777f0d31424d8d6bebd67"} Mar 21 05:01:47 crc kubenswrapper[4775]: I0321 05:01:47.879796 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:47 crc kubenswrapper[4775]: I0321 05:01:47.928983 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:48 crc kubenswrapper[4775]: I0321 05:01:48.690856 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"] Mar 21 05:01:49 crc kubenswrapper[4775]: I0321 05:01:49.820582 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5l6s" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" 
containerName="registry-server" containerID="cri-o://66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd" gracePeriod=2 Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.475415 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.562702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ww7q\" (UniqueName: \"kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q\") pod \"e3a4716c-b33b-4030-a4fa-0feda18a8657\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.562789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content\") pod \"e3a4716c-b33b-4030-a4fa-0feda18a8657\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.562820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities\") pod \"e3a4716c-b33b-4030-a4fa-0feda18a8657\" (UID: \"e3a4716c-b33b-4030-a4fa-0feda18a8657\") " Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.563945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities" (OuterVolumeSpecName: "utilities") pod "e3a4716c-b33b-4030-a4fa-0feda18a8657" (UID: "e3a4716c-b33b-4030-a4fa-0feda18a8657"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.573392 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q" (OuterVolumeSpecName: "kube-api-access-5ww7q") pod "e3a4716c-b33b-4030-a4fa-0feda18a8657" (UID: "e3a4716c-b33b-4030-a4fa-0feda18a8657"). InnerVolumeSpecName "kube-api-access-5ww7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.664243 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ww7q\" (UniqueName: \"kubernetes.io/projected/e3a4716c-b33b-4030-a4fa-0feda18a8657-kube-api-access-5ww7q\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.664576 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.709863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3a4716c-b33b-4030-a4fa-0feda18a8657" (UID: "e3a4716c-b33b-4030-a4fa-0feda18a8657"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.766183 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a4716c-b33b-4030-a4fa-0feda18a8657-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.827025 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerID="66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd" exitCode=0 Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.827081 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerDied","Data":"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd"} Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.827110 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5l6s" event={"ID":"e3a4716c-b33b-4030-a4fa-0feda18a8657","Type":"ContainerDied","Data":"e0c863098fc3ee0671df4688c9648d37c09a40f58d77f06335e48b368ac9c588"} Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.827160 4775 scope.go:117] "RemoveContainer" containerID="66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.827301 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5l6s" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.834346 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" event={"ID":"38d80f78-fa33-49b7-99c2-62d50d1c011b","Type":"ContainerStarted","Data":"0828245983a7e49a9e117cc12dfaf0365eeb35c9e1e14cb85941fcf0da8e81b5"} Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.834962 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.836750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" event={"ID":"8c52832d-1aee-4eac-b625-24110b985402","Type":"ContainerStarted","Data":"22e178e2bce37747bac8ef46e8573736db1a74306e9f3e342f7fee33b4fe30f2"} Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.837217 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.855869 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" podStartSLOduration=2.207598325 podStartE2EDuration="7.855847893s" podCreationTimestamp="2026-03-21 05:01:43 +0000 UTC" firstStartedPulling="2026-03-21 05:01:44.391714415 +0000 UTC m=+857.368178039" lastFinishedPulling="2026-03-21 05:01:50.039963983 +0000 UTC m=+863.016427607" observedRunningTime="2026-03-21 05:01:50.853221689 +0000 UTC m=+863.829685313" watchObservedRunningTime="2026-03-21 05:01:50.855847893 +0000 UTC m=+863.832311517" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.858697 4775 scope.go:117] "RemoveContainer" containerID="bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b" Mar 
21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.876854 4775 scope.go:117] "RemoveContainer" containerID="2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.877968 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" podStartSLOduration=1.985400447 podStartE2EDuration="7.877946847s" podCreationTimestamp="2026-03-21 05:01:43 +0000 UTC" firstStartedPulling="2026-03-21 05:01:44.57877949 +0000 UTC m=+857.555243114" lastFinishedPulling="2026-03-21 05:01:50.47132589 +0000 UTC m=+863.447789514" observedRunningTime="2026-03-21 05:01:50.874943363 +0000 UTC m=+863.851406987" watchObservedRunningTime="2026-03-21 05:01:50.877946847 +0000 UTC m=+863.854410471" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.890697 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"] Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.894657 4775 scope.go:117] "RemoveContainer" containerID="66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd" Mar 21 05:01:50 crc kubenswrapper[4775]: E0321 05:01:50.895473 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd\": container with ID starting with 66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd not found: ID does not exist" containerID="66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.895524 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd"} err="failed to get container status \"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd\": rpc error: code = NotFound 
desc = could not find container \"66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd\": container with ID starting with 66bada73977196eaa9ea4d56550148a1585d2a3a1a5efa78de5403ca207a8ebd not found: ID does not exist" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.895611 4775 scope.go:117] "RemoveContainer" containerID="bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b" Mar 21 05:01:50 crc kubenswrapper[4775]: E0321 05:01:50.896771 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b\": container with ID starting with bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b not found: ID does not exist" containerID="bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.896844 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b"} err="failed to get container status \"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b\": rpc error: code = NotFound desc = could not find container \"bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b\": container with ID starting with bfefe0cac35a2f3c7e6293ed113af6198ac9e4db485c73a73295e5e65ba7c34b not found: ID does not exist" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.896881 4775 scope.go:117] "RemoveContainer" containerID="2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a" Mar 21 05:01:50 crc kubenswrapper[4775]: E0321 05:01:50.901339 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a\": container with ID starting with 
2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a not found: ID does not exist" containerID="2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.901490 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a"} err="failed to get container status \"2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a\": rpc error: code = NotFound desc = could not find container \"2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a\": container with ID starting with 2adc93f631ab51a3bffbfd6d7c28133602106d191f43a08a57386749aee6506a not found: ID does not exist" Mar 21 05:01:50 crc kubenswrapper[4775]: I0321 05:01:50.904559 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5l6s"] Mar 21 05:01:51 crc kubenswrapper[4775]: I0321 05:01:51.668710 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" path="/var/lib/kubelet/pods/e3a4716c-b33b-4030-a4fa-0feda18a8657/volumes" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.153847 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567822-pqlrw"] Mar 21 05:02:00 crc kubenswrapper[4775]: E0321 05:02:00.154730 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="registry-server" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.154748 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="registry-server" Mar 21 05:02:00 crc kubenswrapper[4775]: E0321 05:02:00.154759 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="extract-utilities" Mar 21 05:02:00 crc 
kubenswrapper[4775]: I0321 05:02:00.154766 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="extract-utilities" Mar 21 05:02:00 crc kubenswrapper[4775]: E0321 05:02:00.154775 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="extract-content" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.154781 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="extract-content" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.154906 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a4716c-b33b-4030-a4fa-0feda18a8657" containerName="registry-server" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.155353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.157790 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.158384 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.158661 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.175221 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-pqlrw"] Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.297686 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcgjc\" (UniqueName: \"kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc\") pod 
\"auto-csr-approver-29567822-pqlrw\" (UID: \"f791f730-ffa2-44d3-b151-dc9d6e6e1743\") " pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.398985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcgjc\" (UniqueName: \"kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc\") pod \"auto-csr-approver-29567822-pqlrw\" (UID: \"f791f730-ffa2-44d3-b151-dc9d6e6e1743\") " pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.423795 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcgjc\" (UniqueName: \"kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc\") pod \"auto-csr-approver-29567822-pqlrw\" (UID: \"f791f730-ffa2-44d3-b151-dc9d6e6e1743\") " pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.472262 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:00 crc kubenswrapper[4775]: I0321 05:02:00.900555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-pqlrw"] Mar 21 05:02:01 crc kubenswrapper[4775]: I0321 05:02:01.898770 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" event={"ID":"f791f730-ffa2-44d3-b151-dc9d6e6e1743","Type":"ContainerStarted","Data":"439adf561bf5728b29d86d131eb9f13bcfb89b7a3f50e301bdeb183340af5ce2"} Mar 21 05:02:02 crc kubenswrapper[4775]: I0321 05:02:02.907189 4775 generic.go:334] "Generic (PLEG): container finished" podID="f791f730-ffa2-44d3-b151-dc9d6e6e1743" containerID="a9b0f66cd072ab055301f7bd9fc0725dbbd203bf7fa9a48d07d610fb2904d30b" exitCode=0 Mar 21 05:02:02 crc kubenswrapper[4775]: I0321 05:02:02.907232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" event={"ID":"f791f730-ffa2-44d3-b151-dc9d6e6e1743","Type":"ContainerDied","Data":"a9b0f66cd072ab055301f7bd9fc0725dbbd203bf7fa9a48d07d610fb2904d30b"} Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.214516 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.265598 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcgjc\" (UniqueName: \"kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc\") pod \"f791f730-ffa2-44d3-b151-dc9d6e6e1743\" (UID: \"f791f730-ffa2-44d3-b151-dc9d6e6e1743\") " Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.273294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc" (OuterVolumeSpecName: "kube-api-access-dcgjc") pod "f791f730-ffa2-44d3-b151-dc9d6e6e1743" (UID: "f791f730-ffa2-44d3-b151-dc9d6e6e1743"). InnerVolumeSpecName "kube-api-access-dcgjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.360589 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-559bfcf5c-qqsvn" Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.366680 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcgjc\" (UniqueName: \"kubernetes.io/projected/f791f730-ffa2-44d3-b151-dc9d6e6e1743-kube-api-access-dcgjc\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.919622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" event={"ID":"f791f730-ffa2-44d3-b151-dc9d6e6e1743","Type":"ContainerDied","Data":"439adf561bf5728b29d86d131eb9f13bcfb89b7a3f50e301bdeb183340af5ce2"} Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 05:02:04.919659 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439adf561bf5728b29d86d131eb9f13bcfb89b7a3f50e301bdeb183340af5ce2" Mar 21 05:02:04 crc kubenswrapper[4775]: I0321 
05:02:04.919691 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-pqlrw" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.263897 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-7wggn"] Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.276322 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-7wggn"] Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.555964 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:05 crc kubenswrapper[4775]: E0321 05:02:05.556255 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f791f730-ffa2-44d3-b151-dc9d6e6e1743" containerName="oc" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.556274 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f791f730-ffa2-44d3-b151-dc9d6e6e1743" containerName="oc" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.556394 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f791f730-ffa2-44d3-b151-dc9d6e6e1743" containerName="oc" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.557353 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.566541 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.667508 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1edcd264-4604-481b-b146-1ed5a34badba" path="/var/lib/kubelet/pods/1edcd264-4604-481b-b146-1ed5a34badba/volumes" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.690322 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.690419 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp2c\" (UniqueName: \"kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.690477 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.791468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities\") 
pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.791730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.791803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp2c\" (UniqueName: \"kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.792110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.792250 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content\") pod \"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.811363 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp2c\" (UniqueName: \"kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c\") pod 
\"certified-operators-hxvz8\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:05 crc kubenswrapper[4775]: I0321 05:02:05.874094 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:06 crc kubenswrapper[4775]: I0321 05:02:06.164633 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:06 crc kubenswrapper[4775]: I0321 05:02:06.947654 4775 generic.go:334] "Generic (PLEG): container finished" podID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerID="cc6575c3206e4931e90f2d2eeedfd610fe0c60e3f48437becf4f5af661bb2558" exitCode=0 Mar 21 05:02:06 crc kubenswrapper[4775]: I0321 05:02:06.947726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerDied","Data":"cc6575c3206e4931e90f2d2eeedfd610fe0c60e3f48437becf4f5af661bb2558"} Mar 21 05:02:06 crc kubenswrapper[4775]: I0321 05:02:06.947968 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerStarted","Data":"b496a0cb17e25c9afc1e8fbd644c8d6a7c4698da16d39e00bebd370b5d3a0cc3"} Mar 21 05:02:08 crc kubenswrapper[4775]: I0321 05:02:08.962301 4775 generic.go:334] "Generic (PLEG): container finished" podID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerID="e6b7d92e6c0e684618dc387b9902e9548ae9e2cf09c7362448b678bb11ed2975" exitCode=0 Mar 21 05:02:08 crc kubenswrapper[4775]: I0321 05:02:08.962404 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerDied","Data":"e6b7d92e6c0e684618dc387b9902e9548ae9e2cf09c7362448b678bb11ed2975"} Mar 21 05:02:09 crc 
kubenswrapper[4775]: I0321 05:02:09.970871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerStarted","Data":"9cec0f381408ed32105d216b30f0489513e66921ad73f2364aedcec396fd2f22"} Mar 21 05:02:09 crc kubenswrapper[4775]: I0321 05:02:09.993300 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hxvz8" podStartSLOduration=2.155120302 podStartE2EDuration="4.993282804s" podCreationTimestamp="2026-03-21 05:02:05 +0000 UTC" firstStartedPulling="2026-03-21 05:02:06.950329678 +0000 UTC m=+879.926793342" lastFinishedPulling="2026-03-21 05:02:09.78849223 +0000 UTC m=+882.764955844" observedRunningTime="2026-03-21 05:02:09.989981521 +0000 UTC m=+882.966445165" watchObservedRunningTime="2026-03-21 05:02:09.993282804 +0000 UTC m=+882.969746428" Mar 21 05:02:15 crc kubenswrapper[4775]: I0321 05:02:15.874363 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:15 crc kubenswrapper[4775]: I0321 05:02:15.875096 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:15 crc kubenswrapper[4775]: I0321 05:02:15.936732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:16 crc kubenswrapper[4775]: I0321 05:02:16.044647 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:18 crc kubenswrapper[4775]: I0321 05:02:18.346651 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:19 crc kubenswrapper[4775]: I0321 05:02:19.023698 4775 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-hxvz8" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="registry-server" containerID="cri-o://9cec0f381408ed32105d216b30f0489513e66921ad73f2364aedcec396fd2f22" gracePeriod=2 Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.039144 4775 generic.go:334] "Generic (PLEG): container finished" podID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerID="9cec0f381408ed32105d216b30f0489513e66921ad73f2364aedcec396fd2f22" exitCode=0 Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.039157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerDied","Data":"9cec0f381408ed32105d216b30f0489513e66921ad73f2364aedcec396fd2f22"} Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.474747 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.645211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp2c\" (UniqueName: \"kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c\") pod \"65fae7ea-6284-4905-aa68-3251c5ef0602\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.645294 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content\") pod \"65fae7ea-6284-4905-aa68-3251c5ef0602\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.645328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities\") pod 
\"65fae7ea-6284-4905-aa68-3251c5ef0602\" (UID: \"65fae7ea-6284-4905-aa68-3251c5ef0602\") " Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.646283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities" (OuterVolumeSpecName: "utilities") pod "65fae7ea-6284-4905-aa68-3251c5ef0602" (UID: "65fae7ea-6284-4905-aa68-3251c5ef0602"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.660529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c" (OuterVolumeSpecName: "kube-api-access-5dp2c") pod "65fae7ea-6284-4905-aa68-3251c5ef0602" (UID: "65fae7ea-6284-4905-aa68-3251c5ef0602"). InnerVolumeSpecName "kube-api-access-5dp2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.695680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65fae7ea-6284-4905-aa68-3251c5ef0602" (UID: "65fae7ea-6284-4905-aa68-3251c5ef0602"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.747537 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp2c\" (UniqueName: \"kubernetes.io/projected/65fae7ea-6284-4905-aa68-3251c5ef0602-kube-api-access-5dp2c\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.747570 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:21 crc kubenswrapper[4775]: I0321 05:02:21.747588 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65fae7ea-6284-4905-aa68-3251c5ef0602-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.048541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxvz8" event={"ID":"65fae7ea-6284-4905-aa68-3251c5ef0602","Type":"ContainerDied","Data":"b496a0cb17e25c9afc1e8fbd644c8d6a7c4698da16d39e00bebd370b5d3a0cc3"} Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.048591 4775 scope.go:117] "RemoveContainer" containerID="9cec0f381408ed32105d216b30f0489513e66921ad73f2364aedcec396fd2f22" Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.048604 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxvz8" Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.068156 4775 scope.go:117] "RemoveContainer" containerID="e6b7d92e6c0e684618dc387b9902e9548ae9e2cf09c7362448b678bb11ed2975" Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.079700 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.086642 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hxvz8"] Mar 21 05:02:22 crc kubenswrapper[4775]: I0321 05:02:22.103649 4775 scope.go:117] "RemoveContainer" containerID="cc6575c3206e4931e90f2d2eeedfd610fe0c60e3f48437becf4f5af661bb2558" Mar 21 05:02:23 crc kubenswrapper[4775]: I0321 05:02:23.668973 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" path="/var/lib/kubelet/pods/65fae7ea-6284-4905-aa68-3251c5ef0602/volumes" Mar 21 05:02:23 crc kubenswrapper[4775]: I0321 05:02:23.896799 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6695f56dbb-f6tqn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.548902 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hmmgn"] Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.549201 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="extract-utilities" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.549221 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="extract-utilities" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.549236 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" 
containerName="registry-server" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.549245 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="registry-server" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.549267 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="extract-content" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.549283 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="extract-content" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.549468 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fae7ea-6284-4905-aa68-3251c5ef0602" containerName="registry-server" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.552383 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.556013 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.556708 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zh4bc" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.556730 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.564642 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"] Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.565754 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.567822 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.575022 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"] Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586628 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/e5808d9b-074a-4948-8283-fdfea77c63bc-kube-api-access-n9mrv\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586700 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-sockets\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-conf\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586818 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " 
pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb66z\" (UniqueName: \"kubernetes.io/projected/530a34fb-bf82-4a2e-afdd-ec646afdebcd-kube-api-access-rb66z\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586870 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.586986 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.587016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-startup\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.587130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-reloader\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc 
kubenswrapper[4775]: I0321 05:02:24.630885 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cpw6m"] Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.631751 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.633129 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.633887 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.633909 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.634570 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-h6mzq" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.649061 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qq976"] Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.650190 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.651748 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.674251 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qq976"] Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688472 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwcj\" (UniqueName: \"kubernetes.io/projected/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-kube-api-access-7rwcj\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/e5808d9b-074a-4948-8283-fdfea77c63bc-kube-api-access-n9mrv\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688563 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-sockets\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688588 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-conf\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 
05:02:24.688628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb66z\" (UniqueName: \"kubernetes.io/projected/530a34fb-bf82-4a2e-afdd-ec646afdebcd-kube-api-access-rb66z\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688682 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metrics-certs\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metallb-excludel2\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688772 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gjn5t\" (UniqueName: \"kubernetes.io/projected/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-kube-api-access-gjn5t\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688797 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-cert\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688860 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-metrics-certs\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-startup\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-reloader\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.688974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-conf\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.688985 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.689021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.689044 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.689036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-sockets\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 
05:02:24.689094 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs podName:530a34fb-bf82-4a2e-afdd-ec646afdebcd nodeName:}" failed. No retries permitted until 2026-03-21 05:02:25.189076926 +0000 UTC m=+898.165540540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs") pod "frr-k8s-hmmgn" (UID: "530a34fb-bf82-4a2e-afdd-ec646afdebcd") : secret "frr-k8s-certs-secret" not found Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.689129 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert podName:e5808d9b-074a-4948-8283-fdfea77c63bc nodeName:}" failed. No retries permitted until 2026-03-21 05:02:25.189105726 +0000 UTC m=+898.165569350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert") pod "frr-k8s-webhook-server-bcc4b6f68-7jsnx" (UID: "e5808d9b-074a-4948-8283-fdfea77c63bc") : secret "frr-k8s-webhook-server-cert" not found Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.689253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/530a34fb-bf82-4a2e-afdd-ec646afdebcd-reloader\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.689883 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/530a34fb-bf82-4a2e-afdd-ec646afdebcd-frr-startup\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.712810 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mrv\" (UniqueName: \"kubernetes.io/projected/e5808d9b-074a-4948-8283-fdfea77c63bc-kube-api-access-n9mrv\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.720659 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb66z\" (UniqueName: \"kubernetes.io/projected/530a34fb-bf82-4a2e-afdd-ec646afdebcd-kube-api-access-rb66z\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789736 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metallb-excludel2\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn5t\" (UniqueName: \"kubernetes.io/projected/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-kube-api-access-gjn5t\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-cert\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789849 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-metrics-certs\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwcj\" (UniqueName: \"kubernetes.io/projected/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-kube-api-access-7rwcj\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.789986 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metrics-certs\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m" Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.790045 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 05:02:24 crc kubenswrapper[4775]: E0321 05:02:24.790144 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist podName:1d9349cc-e186-40c5-bb71-c176ff4f0a0d nodeName:}" failed. No retries permitted until 2026-03-21 05:02:25.29010457 +0000 UTC m=+898.266568184 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist") pod "speaker-cpw6m" (UID: "1d9349cc-e186-40c5-bb71-c176ff4f0a0d") : secret "metallb-memberlist" not found
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.791020 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metallb-excludel2\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.793235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-metrics-certs\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.793677 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-cert\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.796586 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-metrics-certs\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.809581 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwcj\" (UniqueName: \"kubernetes.io/projected/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-kube-api-access-7rwcj\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.811923 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn5t\" (UniqueName: \"kubernetes.io/projected/ac0be1f3-95f0-40a4-9a94-c74cdaad9590-kube-api-access-gjn5t\") pod \"controller-7bb4cc7c98-qq976\" (UID: \"ac0be1f3-95f0-40a4-9a94-c74cdaad9590\") " pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:24 crc kubenswrapper[4775]: I0321 05:02:24.977894 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.195347 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.196455 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.199846 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/530a34fb-bf82-4a2e-afdd-ec646afdebcd-metrics-certs\") pod \"frr-k8s-hmmgn\" (UID: \"530a34fb-bf82-4a2e-afdd-ec646afdebcd\") " pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.199944 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5808d9b-074a-4948-8283-fdfea77c63bc-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7jsnx\" (UID: \"e5808d9b-074a-4948-8283-fdfea77c63bc\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.297161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:25 crc kubenswrapper[4775]: E0321 05:02:25.297325 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 21 05:02:25 crc kubenswrapper[4775]: E0321 05:02:25.297379 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist podName:1d9349cc-e186-40c5-bb71-c176ff4f0a0d nodeName:}" failed. No retries permitted until 2026-03-21 05:02:26.29736365 +0000 UTC m=+899.273827284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist") pod "speaker-cpw6m" (UID: "1d9349cc-e186-40c5-bb71-c176ff4f0a0d") : secret "metallb-memberlist" not found
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.412038 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qq976"]
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.480632 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.486551 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"
Mar 21 05:02:25 crc kubenswrapper[4775]: I0321 05:02:25.744046 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"]
Mar 21 05:02:25 crc kubenswrapper[4775]: W0321 05:02:25.758847 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5808d9b_074a_4948_8283_fdfea77c63bc.slice/crio-cb8df94439fc36f44eb6b9af7bbbaeee29cb06bd6c4d6c4f6cda37fa4cedab88 WatchSource:0}: Error finding container cb8df94439fc36f44eb6b9af7bbbaeee29cb06bd6c4d6c4f6cda37fa4cedab88: Status 404 returned error can't find the container with id cb8df94439fc36f44eb6b9af7bbbaeee29cb06bd6c4d6c4f6cda37fa4cedab88
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.072426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" event={"ID":"e5808d9b-074a-4948-8283-fdfea77c63bc","Type":"ContainerStarted","Data":"cb8df94439fc36f44eb6b9af7bbbaeee29cb06bd6c4d6c4f6cda37fa4cedab88"}
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.073752 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qq976" event={"ID":"ac0be1f3-95f0-40a4-9a94-c74cdaad9590","Type":"ContainerStarted","Data":"a7d8aee5eb9ff6cafdeb4afc831b28835e283677be48c217812f06959791a63b"}
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.073797 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qq976" event={"ID":"ac0be1f3-95f0-40a4-9a94-c74cdaad9590","Type":"ContainerStarted","Data":"de8493125539d9d7f5ce1ea7208ef751013cd2467f41b90d3d349712234d4cd4"}
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.075816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"89337512b552921c50f81be5c9f601e1c5d7a75e332342a9762c46267d45f24b"}
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.307963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.313540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1d9349cc-e186-40c5-bb71-c176ff4f0a0d-memberlist\") pod \"speaker-cpw6m\" (UID: \"1d9349cc-e186-40c5-bb71-c176ff4f0a0d\") " pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:26 crc kubenswrapper[4775]: I0321 05:02:26.446575 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:26 crc kubenswrapper[4775]: W0321 05:02:26.479073 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d9349cc_e186_40c5_bb71_c176ff4f0a0d.slice/crio-94e56210d51a7672e1ae716a39d6eef54de6d76a15c7e07a6fff8c737a98462e WatchSource:0}: Error finding container 94e56210d51a7672e1ae716a39d6eef54de6d76a15c7e07a6fff8c737a98462e: Status 404 returned error can't find the container with id 94e56210d51a7672e1ae716a39d6eef54de6d76a15c7e07a6fff8c737a98462e
Mar 21 05:02:27 crc kubenswrapper[4775]: I0321 05:02:27.082667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qq976" event={"ID":"ac0be1f3-95f0-40a4-9a94-c74cdaad9590","Type":"ContainerStarted","Data":"5cb3f47ea90c47d8cc0e49911dfa2b8add574ff8d691407a25c8af4fd435cd17"}
Mar 21 05:02:27 crc kubenswrapper[4775]: I0321 05:02:27.083141 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:27 crc kubenswrapper[4775]: I0321 05:02:27.084518 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cpw6m" event={"ID":"1d9349cc-e186-40c5-bb71-c176ff4f0a0d","Type":"ContainerStarted","Data":"94e56210d51a7672e1ae716a39d6eef54de6d76a15c7e07a6fff8c737a98462e"}
Mar 21 05:02:27 crc kubenswrapper[4775]: I0321 05:02:27.099519 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qq976" podStartSLOduration=3.099504323 podStartE2EDuration="3.099504323s" podCreationTimestamp="2026-03-21 05:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:02:27.097757364 +0000 UTC m=+900.074220988" watchObservedRunningTime="2026-03-21 05:02:27.099504323 +0000 UTC m=+900.075967947"
Mar 21 05:02:28 crc kubenswrapper[4775]: I0321 05:02:28.128774 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cpw6m" event={"ID":"1d9349cc-e186-40c5-bb71-c176ff4f0a0d","Type":"ContainerStarted","Data":"90c553183b0c8e95182d23edacc52df10433cabc4d0076042525c078b0df1f45"}
Mar 21 05:02:28 crc kubenswrapper[4775]: I0321 05:02:28.129198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cpw6m" event={"ID":"1d9349cc-e186-40c5-bb71-c176ff4f0a0d","Type":"ContainerStarted","Data":"a5c54ae33d6f574b7e40b16d5dd7c88eff1c53c7d889c60ba90b1df5580a637e"}
Mar 21 05:02:28 crc kubenswrapper[4775]: I0321 05:02:28.153529 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cpw6m" podStartSLOduration=4.153513779 podStartE2EDuration="4.153513779s" podCreationTimestamp="2026-03-21 05:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:02:28.150221046 +0000 UTC m=+901.126684690" watchObservedRunningTime="2026-03-21 05:02:28.153513779 +0000 UTC m=+901.129977403"
Mar 21 05:02:29 crc kubenswrapper[4775]: I0321 05:02:29.145347 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:34 crc kubenswrapper[4775]: I0321 05:02:34.179469 4775 generic.go:334] "Generic (PLEG): container finished" podID="530a34fb-bf82-4a2e-afdd-ec646afdebcd" containerID="54e259d8125bddf7796cb32d7ca683f2f4184703473bd38e4b6a78ad9f9526e6" exitCode=0
Mar 21 05:02:34 crc kubenswrapper[4775]: I0321 05:02:34.179529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerDied","Data":"54e259d8125bddf7796cb32d7ca683f2f4184703473bd38e4b6a78ad9f9526e6"}
Mar 21 05:02:34 crc kubenswrapper[4775]: I0321 05:02:34.181916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" event={"ID":"e5808d9b-074a-4948-8283-fdfea77c63bc","Type":"ContainerStarted","Data":"8f1a6980a0fec05028213e7a314e298e7e837a36ae032942202c882dc6f3a3b3"}
Mar 21 05:02:34 crc kubenswrapper[4775]: I0321 05:02:34.182325 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"
Mar 21 05:02:34 crc kubenswrapper[4775]: I0321 05:02:34.225004 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx" podStartSLOduration=2.867722845 podStartE2EDuration="10.224983585s" podCreationTimestamp="2026-03-21 05:02:24 +0000 UTC" firstStartedPulling="2026-03-21 05:02:25.76564171 +0000 UTC m=+898.742105334" lastFinishedPulling="2026-03-21 05:02:33.12290245 +0000 UTC m=+906.099366074" observedRunningTime="2026-03-21 05:02:34.215880548 +0000 UTC m=+907.192344162" watchObservedRunningTime="2026-03-21 05:02:34.224983585 +0000 UTC m=+907.201447209"
Mar 21 05:02:35 crc kubenswrapper[4775]: I0321 05:02:35.189868 4775 generic.go:334] "Generic (PLEG): container finished" podID="530a34fb-bf82-4a2e-afdd-ec646afdebcd" containerID="145d9a2ceda338b7345a9b0fa7c873ac6f653688019db2c535c7a4c272788f3b" exitCode=0
Mar 21 05:02:35 crc kubenswrapper[4775]: I0321 05:02:35.189967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerDied","Data":"145d9a2ceda338b7345a9b0fa7c873ac6f653688019db2c535c7a4c272788f3b"}
Mar 21 05:02:36 crc kubenswrapper[4775]: I0321 05:02:36.197907 4775 generic.go:334] "Generic (PLEG): container finished" podID="530a34fb-bf82-4a2e-afdd-ec646afdebcd" containerID="0c54e8f5c1e6b7dea5786eaa31e7e8687b528834571742012096886361314ce6" exitCode=0
Mar 21 05:02:36 crc kubenswrapper[4775]: I0321 05:02:36.197959 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerDied","Data":"0c54e8f5c1e6b7dea5786eaa31e7e8687b528834571742012096886361314ce6"}
Mar 21 05:02:37 crc kubenswrapper[4775]: I0321 05:02:37.208816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"9c42a35f97ea0ba130660183ffba8f33c9749b8aebc6913493caf59c6dc4cea6"}
Mar 21 05:02:37 crc kubenswrapper[4775]: I0321 05:02:37.208862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"9a626c5ffdaa4b0c5c78534d88a49bfc2781bd49f4ff6c2a0cf10147c501a196"}
Mar 21 05:02:37 crc kubenswrapper[4775]: I0321 05:02:37.208871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"fe0785423bed047ecc10f06350efd59ec25ba622f5842aebec09a977f496e536"}
Mar 21 05:02:37 crc kubenswrapper[4775]: I0321 05:02:37.208882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"d0d752395ccc084e37cf5240f02979af8e6401976260d9b82ba569336db0be16"}
Mar 21 05:02:37 crc kubenswrapper[4775]: I0321 05:02:37.296978 4775 scope.go:117] "RemoveContainer" containerID="67295240b136b0bcf5b3e83b034399ed927b48c5da3ccf9341e8f2fa7f873c3a"
Mar 21 05:02:38 crc kubenswrapper[4775]: I0321 05:02:38.219540 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"e53eb5608fb39d95b16699dfb80d0b50647087d0f2a635ee454e89cb381dea5a"}
Mar 21 05:02:38 crc kubenswrapper[4775]: I0321 05:02:38.219801 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hmmgn" event={"ID":"530a34fb-bf82-4a2e-afdd-ec646afdebcd","Type":"ContainerStarted","Data":"b8d1c6e8deb07620cfda015d5497bf040582fad0feac2d48c532974f3c04be9b"}
Mar 21 05:02:38 crc kubenswrapper[4775]: I0321 05:02:38.219838 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:38 crc kubenswrapper[4775]: I0321 05:02:38.247380 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hmmgn" podStartSLOduration=6.968295908 podStartE2EDuration="14.247360679s" podCreationTimestamp="2026-03-21 05:02:24 +0000 UTC" firstStartedPulling="2026-03-21 05:02:25.820835399 +0000 UTC m=+898.797299023" lastFinishedPulling="2026-03-21 05:02:33.09990017 +0000 UTC m=+906.076363794" observedRunningTime="2026-03-21 05:02:38.244473227 +0000 UTC m=+911.220936861" watchObservedRunningTime="2026-03-21 05:02:38.247360679 +0000 UTC m=+911.223824323"
Mar 21 05:02:40 crc kubenswrapper[4775]: I0321 05:02:40.481085 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:40 crc kubenswrapper[4775]: I0321 05:02:40.539745 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:44 crc kubenswrapper[4775]: I0321 05:02:44.996972 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qq976"
Mar 21 05:02:45 crc kubenswrapper[4775]: I0321 05:02:45.492555 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7jsnx"
Mar 21 05:02:46 crc kubenswrapper[4775]: I0321 05:02:46.450238 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cpw6m"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.362666 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.364162 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.367504 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.369283 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.369541 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4vkm5"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.373934 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.446022 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42z9n\" (UniqueName: \"kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n\") pod \"openstack-operator-index-7tr45\" (UID: \"c55e2ca9-69fd-41be-a5d8-838cdf259c0d\") " pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.547340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42z9n\" (UniqueName: \"kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n\") pod \"openstack-operator-index-7tr45\" (UID: \"c55e2ca9-69fd-41be-a5d8-838cdf259c0d\") " pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.575828 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42z9n\" (UniqueName: \"kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n\") pod \"openstack-operator-index-7tr45\" (UID: \"c55e2ca9-69fd-41be-a5d8-838cdf259c0d\") " pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:52 crc kubenswrapper[4775]: I0321 05:02:52.682093 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:53 crc kubenswrapper[4775]: I0321 05:02:53.092862 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:02:53 crc kubenswrapper[4775]: I0321 05:02:53.309740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tr45" event={"ID":"c55e2ca9-69fd-41be-a5d8-838cdf259c0d","Type":"ContainerStarted","Data":"995a3cedd5dcd750f59c7675e4bd7b49d6165f4247ebdf791eccf8293e7782e5"}
Mar 21 05:02:55 crc kubenswrapper[4775]: I0321 05:02:55.483726 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hmmgn"
Mar 21 05:02:57 crc kubenswrapper[4775]: I0321 05:02:57.340650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tr45" event={"ID":"c55e2ca9-69fd-41be-a5d8-838cdf259c0d","Type":"ContainerStarted","Data":"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"}
Mar 21 05:02:57 crc kubenswrapper[4775]: I0321 05:02:57.362968 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7tr45" podStartSLOduration=2.163409864 podStartE2EDuration="5.362949941s" podCreationTimestamp="2026-03-21 05:02:52 +0000 UTC" firstStartedPulling="2026-03-21 05:02:53.109835956 +0000 UTC m=+926.086299590" lastFinishedPulling="2026-03-21 05:02:56.309376043 +0000 UTC m=+929.285839667" observedRunningTime="2026-03-21 05:02:57.360407079 +0000 UTC m=+930.336870703" watchObservedRunningTime="2026-03-21 05:02:57.362949941 +0000 UTC m=+930.339413575"
Mar 21 05:02:57 crc kubenswrapper[4775]: I0321 05:02:57.749279 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.359161 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hvtkg"]
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.360023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.376417 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvtkg"]
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.424686 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6hd\" (UniqueName: \"kubernetes.io/projected/8c7426e8-8cec-4c84-8810-03a091d87cd9-kube-api-access-kn6hd\") pod \"openstack-operator-index-hvtkg\" (UID: \"8c7426e8-8cec-4c84-8810-03a091d87cd9\") " pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.526452 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6hd\" (UniqueName: \"kubernetes.io/projected/8c7426e8-8cec-4c84-8810-03a091d87cd9-kube-api-access-kn6hd\") pod \"openstack-operator-index-hvtkg\" (UID: \"8c7426e8-8cec-4c84-8810-03a091d87cd9\") " pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.545871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6hd\" (UniqueName: \"kubernetes.io/projected/8c7426e8-8cec-4c84-8810-03a091d87cd9-kube-api-access-kn6hd\") pod \"openstack-operator-index-hvtkg\" (UID: \"8c7426e8-8cec-4c84-8810-03a091d87cd9\") " pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:02:58 crc kubenswrapper[4775]: I0321 05:02:58.677636 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.069680 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hvtkg"]
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.354262 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7tr45" podUID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" containerName="registry-server" containerID="cri-o://685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab" gracePeriod=2
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.354823 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvtkg" event={"ID":"8c7426e8-8cec-4c84-8810-03a091d87cd9","Type":"ContainerStarted","Data":"5d31662d351f54fed982495f8f8148bc6d813e97a6151cf3f8cb22892e4542ad"}
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.354845 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hvtkg" event={"ID":"8c7426e8-8cec-4c84-8810-03a091d87cd9","Type":"ContainerStarted","Data":"7a48142b15e92f4c26c5e5751402fda8d00b456235bff782d4510e971251e495"}
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.382932 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hvtkg" podStartSLOduration=1.344728895 podStartE2EDuration="1.382908067s" podCreationTimestamp="2026-03-21 05:02:58 +0000 UTC" firstStartedPulling="2026-03-21 05:02:59.067788476 +0000 UTC m=+932.044252100" lastFinishedPulling="2026-03-21 05:02:59.105967648 +0000 UTC m=+932.082431272" observedRunningTime="2026-03-21 05:02:59.377141744 +0000 UTC m=+932.353605358" watchObservedRunningTime="2026-03-21 05:02:59.382908067 +0000 UTC m=+932.359371701"
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.726627 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.844715 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42z9n\" (UniqueName: \"kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n\") pod \"c55e2ca9-69fd-41be-a5d8-838cdf259c0d\" (UID: \"c55e2ca9-69fd-41be-a5d8-838cdf259c0d\") "
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.851362 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n" (OuterVolumeSpecName: "kube-api-access-42z9n") pod "c55e2ca9-69fd-41be-a5d8-838cdf259c0d" (UID: "c55e2ca9-69fd-41be-a5d8-838cdf259c0d"). InnerVolumeSpecName "kube-api-access-42z9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:02:59 crc kubenswrapper[4775]: I0321 05:02:59.946880 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42z9n\" (UniqueName: \"kubernetes.io/projected/c55e2ca9-69fd-41be-a5d8-838cdf259c0d-kube-api-access-42z9n\") on node \"crc\" DevicePath \"\""
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.364208 4775 generic.go:334] "Generic (PLEG): container finished" podID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" containerID="685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab" exitCode=0
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.364275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tr45" event={"ID":"c55e2ca9-69fd-41be-a5d8-838cdf259c0d","Type":"ContainerDied","Data":"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"}
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.364329 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7tr45"
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.364364 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7tr45" event={"ID":"c55e2ca9-69fd-41be-a5d8-838cdf259c0d","Type":"ContainerDied","Data":"995a3cedd5dcd750f59c7675e4bd7b49d6165f4247ebdf791eccf8293e7782e5"}
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.364400 4775 scope.go:117] "RemoveContainer" containerID="685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.390233 4775 scope.go:117] "RemoveContainer" containerID="685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"
Mar 21 05:03:00 crc kubenswrapper[4775]: E0321 05:03:00.391438 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab\": container with ID starting with 685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab not found: ID does not exist" containerID="685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.391505 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab"} err="failed to get container status \"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab\": rpc error: code = NotFound desc = could not find container \"685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab\": container with ID starting with 685d7998a05afd3596f950c41f49e4d6db42d13a64fd7fd1656134c40c8626ab not found: ID does not exist"
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.405683 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:03:00 crc kubenswrapper[4775]: I0321 05:03:00.414742 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7tr45"]
Mar 21 05:03:01 crc kubenswrapper[4775]: I0321 05:03:01.675745 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" path="/var/lib/kubelet/pods/c55e2ca9-69fd-41be-a5d8-838cdf259c0d/volumes"
Mar 21 05:03:08 crc kubenswrapper[4775]: I0321 05:03:08.678379 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:03:08 crc kubenswrapper[4775]: I0321 05:03:08.679494 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:03:08 crc kubenswrapper[4775]: I0321 05:03:08.713402 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:03:09 crc kubenswrapper[4775]: I0321 05:03:09.440434 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hvtkg"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.701301 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"]
Mar 21 05:03:15 crc kubenswrapper[4775]: E0321 05:03:15.702030 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" containerName="registry-server"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.702041 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" containerName="registry-server"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.702171 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55e2ca9-69fd-41be-a5d8-838cdf259c0d" containerName="registry-server"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.702993 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.704904 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-8lvrj"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.708918 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"]
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.760096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jd54\" (UniqueName: \"kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.760163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.760211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.861611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jd54\" (UniqueName: \"kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.861661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.861695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.862218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.862251 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:15 crc kubenswrapper[4775]: I0321 05:03:15.881306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jd54\" (UniqueName: \"kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54\") pod \"68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:16 crc kubenswrapper[4775]: I0321 05:03:16.021238 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"
Mar 21 05:03:16 crc kubenswrapper[4775]: I0321 05:03:16.441806 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt"]
Mar 21 05:03:16 crc kubenswrapper[4775]: I0321 05:03:16.460513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" event={"ID":"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1","Type":"ContainerStarted","Data":"70feef91658f1da99ff064bdbcec3d2427164483c135492b6403332167eba6c0"}
Mar 21 05:03:17 crc kubenswrapper[4775]: I0321 05:03:17.468607 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerID="817da2dd74a7e5323813c63e67bc39fdde83314b3f9d77ceb23de398e14345ed" exitCode=0
Mar 21 05:03:17 crc kubenswrapper[4775]: I0321 05:03:17.468694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" event={"ID":"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1","Type":"ContainerDied","Data":"817da2dd74a7e5323813c63e67bc39fdde83314b3f9d77ceb23de398e14345ed"}
Mar 21 05:03:17 crc kubenswrapper[4775]: I0321 05:03:17.470950 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:03:18 crc kubenswrapper[4775]: I0321 05:03:18.477507 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerID="2499d2237f0079f5f8fec2d33b699971213e7f38d15de3fca71a9a5053917b17" exitCode=0
Mar 21 05:03:18 crc kubenswrapper[4775]: I0321 05:03:18.478043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" event={"ID":"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1","Type":"ContainerDied","Data":"2499d2237f0079f5f8fec2d33b699971213e7f38d15de3fca71a9a5053917b17"}
Mar 21 05:03:19 crc kubenswrapper[4775]: I0321 05:03:19.487415 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerID="22f4582caaafb23bc51a920330711510038ca7554591e01661392f782fa03521" exitCode=0
Mar 21 05:03:19 crc kubenswrapper[4775]: I0321 05:03:19.487463 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" event={"ID":"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1","Type":"ContainerDied","Data":"22f4582caaafb23bc51a920330711510038ca7554591e01661392f782fa03521"}
Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.763900 4775 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.862961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle\") pod \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.863374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jd54\" (UniqueName: \"kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54\") pod \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.863404 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util\") pod \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\" (UID: \"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1\") " Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.864355 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle" (OuterVolumeSpecName: "bundle") pod "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" (UID: "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.882925 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util" (OuterVolumeSpecName: "util") pod "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" (UID: "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.883340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54" (OuterVolumeSpecName: "kube-api-access-7jd54") pod "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" (UID: "a8157e0e-bf83-4dbd-af86-d09d14e6e1b1"). InnerVolumeSpecName "kube-api-access-7jd54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.964573 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.964606 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jd54\" (UniqueName: \"kubernetes.io/projected/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-kube-api-access-7jd54\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:20 crc kubenswrapper[4775]: I0321 05:03:20.964616 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8157e0e-bf83-4dbd-af86-d09d14e6e1b1-util\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:21 crc kubenswrapper[4775]: I0321 05:03:21.502861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" event={"ID":"a8157e0e-bf83-4dbd-af86-d09d14e6e1b1","Type":"ContainerDied","Data":"70feef91658f1da99ff064bdbcec3d2427164483c135492b6403332167eba6c0"} Mar 21 05:03:21 crc kubenswrapper[4775]: I0321 05:03:21.502903 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70feef91658f1da99ff064bdbcec3d2427164483c135492b6403332167eba6c0" Mar 21 05:03:21 crc kubenswrapper[4775]: I0321 05:03:21.503012 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.413965 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k"] Mar 21 05:03:28 crc kubenswrapper[4775]: E0321 05:03:28.414975 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="pull" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.414998 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="pull" Mar 21 05:03:28 crc kubenswrapper[4775]: E0321 05:03:28.415019 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="extract" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.415030 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="extract" Mar 21 05:03:28 crc kubenswrapper[4775]: E0321 05:03:28.415047 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="util" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.415060 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="util" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.415287 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8157e0e-bf83-4dbd-af86-d09d14e6e1b1" containerName="extract" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.415915 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.418420 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2cms9" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.456923 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k"] Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.568670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9shlv\" (UniqueName: \"kubernetes.io/projected/899f7d20-7208-419e-b0f8-36c7fbf2e841-kube-api-access-9shlv\") pod \"openstack-operator-controller-init-85fcfb8fbb-q2k4k\" (UID: \"899f7d20-7208-419e-b0f8-36c7fbf2e841\") " pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.669692 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9shlv\" (UniqueName: \"kubernetes.io/projected/899f7d20-7208-419e-b0f8-36c7fbf2e841-kube-api-access-9shlv\") pod \"openstack-operator-controller-init-85fcfb8fbb-q2k4k\" (UID: \"899f7d20-7208-419e-b0f8-36c7fbf2e841\") " pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.698452 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9shlv\" (UniqueName: \"kubernetes.io/projected/899f7d20-7208-419e-b0f8-36c7fbf2e841-kube-api-access-9shlv\") pod \"openstack-operator-controller-init-85fcfb8fbb-q2k4k\" (UID: \"899f7d20-7208-419e-b0f8-36c7fbf2e841\") " pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:28 crc kubenswrapper[4775]: I0321 05:03:28.737059 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:29 crc kubenswrapper[4775]: I0321 05:03:29.166655 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k"] Mar 21 05:03:29 crc kubenswrapper[4775]: I0321 05:03:29.553159 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" event={"ID":"899f7d20-7208-419e-b0f8-36c7fbf2e841","Type":"ContainerStarted","Data":"e8d898aaf010ff6001c1f8bad0dc77465c515e5a320cf98c275b33204e0f44d0"} Mar 21 05:03:32 crc kubenswrapper[4775]: I0321 05:03:32.482821 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:03:32 crc kubenswrapper[4775]: I0321 05:03:32.483289 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:03:33 crc kubenswrapper[4775]: I0321 05:03:33.578241 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" event={"ID":"899f7d20-7208-419e-b0f8-36c7fbf2e841","Type":"ContainerStarted","Data":"7ee4cc8250444108e2a58c00da5799e51014a6f93fdd0496dab8bcdab841d184"} Mar 21 05:03:33 crc kubenswrapper[4775]: I0321 05:03:33.578581 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:33 crc 
kubenswrapper[4775]: I0321 05:03:33.606152 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" podStartSLOduration=2.257313755 podStartE2EDuration="5.606135892s" podCreationTimestamp="2026-03-21 05:03:28 +0000 UTC" firstStartedPulling="2026-03-21 05:03:29.181394664 +0000 UTC m=+962.157858288" lastFinishedPulling="2026-03-21 05:03:32.530216801 +0000 UTC m=+965.506680425" observedRunningTime="2026-03-21 05:03:33.602186601 +0000 UTC m=+966.578650225" watchObservedRunningTime="2026-03-21 05:03:33.606135892 +0000 UTC m=+966.582599516" Mar 21 05:03:38 crc kubenswrapper[4775]: I0321 05:03:38.740627 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-85fcfb8fbb-q2k4k" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.461003 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.462588 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.478237 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.619517 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvjj\" (UniqueName: \"kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.619589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.619611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.721163 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.721232 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.721334 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvjj\" (UniqueName: \"kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.721744 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.721780 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.742632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvjj\" (UniqueName: \"kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj\") pod \"redhat-marketplace-vd7j8\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:56 crc kubenswrapper[4775]: I0321 05:03:56.780718 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:03:57 crc kubenswrapper[4775]: I0321 05:03:57.302809 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:03:57 crc kubenswrapper[4775]: I0321 05:03:57.735088 4775 generic.go:334] "Generic (PLEG): container finished" podID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerID="878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b" exitCode=0 Mar 21 05:03:57 crc kubenswrapper[4775]: I0321 05:03:57.735150 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerDied","Data":"878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b"} Mar 21 05:03:57 crc kubenswrapper[4775]: I0321 05:03:57.735434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerStarted","Data":"f95851be8349176e3d371bd1b072ead3e10266f7c5f76bdcbc17b6e01db6f9b1"} Mar 21 05:03:58 crc kubenswrapper[4775]: I0321 05:03:58.743746 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerStarted","Data":"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3"} Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.243513 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.254239 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.266662 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.363963 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6ln\" (UniqueName: \"kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.364247 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.364369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.465433 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6ln\" (UniqueName: \"kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.465506 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.465525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.465913 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.466171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.484401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6ln\" (UniqueName: \"kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln\") pod \"community-operators-6nhhl\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.574877 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.757381 4775 generic.go:334] "Generic (PLEG): container finished" podID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerID="e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3" exitCode=0 Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.757431 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerDied","Data":"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3"} Mar 21 05:03:59 crc kubenswrapper[4775]: I0321 05:03:59.968007 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.164458 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567824-jsc7j"] Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.165686 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.173807 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.173897 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.173811 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.183066 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s48l\" (UniqueName: \"kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l\") pod \"auto-csr-approver-29567824-jsc7j\" (UID: \"c861c77c-15ba-4204-8603-e6093bc1a0b8\") " pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.187392 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-jsc7j"] Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.284247 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s48l\" (UniqueName: \"kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l\") pod \"auto-csr-approver-29567824-jsc7j\" (UID: \"c861c77c-15ba-4204-8603-e6093bc1a0b8\") " pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.304424 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s48l\" (UniqueName: \"kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l\") pod \"auto-csr-approver-29567824-jsc7j\" (UID: \"c861c77c-15ba-4204-8603-e6093bc1a0b8\") " 
pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.489233 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.710789 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-jsc7j"] Mar 21 05:04:00 crc kubenswrapper[4775]: W0321 05:04:00.714625 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc861c77c_15ba_4204_8603_e6093bc1a0b8.slice/crio-740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed WatchSource:0}: Error finding container 740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed: Status 404 returned error can't find the container with id 740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.765137 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerStarted","Data":"5780c596a9174ce034338976560a433a9f2ac7cb3aacc03fbb37bc5801d6945f"} Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.765200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerStarted","Data":"3f6120ae96f0b10b5bf76bd3efa458fabb2c0680a195b83d1f67570f61231bb6"} Mar 21 05:04:00 crc kubenswrapper[4775]: I0321 05:04:00.767008 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" event={"ID":"c861c77c-15ba-4204-8603-e6093bc1a0b8","Type":"ContainerStarted","Data":"740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed"} Mar 21 05:04:01 crc kubenswrapper[4775]: I0321 05:04:01.775586 
4775 generic.go:334] "Generic (PLEG): container finished" podID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerID="5780c596a9174ce034338976560a433a9f2ac7cb3aacc03fbb37bc5801d6945f" exitCode=0 Mar 21 05:04:01 crc kubenswrapper[4775]: I0321 05:04:01.775643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerDied","Data":"5780c596a9174ce034338976560a433a9f2ac7cb3aacc03fbb37bc5801d6945f"} Mar 21 05:04:01 crc kubenswrapper[4775]: I0321 05:04:01.778336 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerStarted","Data":"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c"} Mar 21 05:04:01 crc kubenswrapper[4775]: I0321 05:04:01.813168 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vd7j8" podStartSLOduration=2.781509761 podStartE2EDuration="5.813100508s" podCreationTimestamp="2026-03-21 05:03:56 +0000 UTC" firstStartedPulling="2026-03-21 05:03:57.737308118 +0000 UTC m=+990.713771742" lastFinishedPulling="2026-03-21 05:04:00.768898865 +0000 UTC m=+993.745362489" observedRunningTime="2026-03-21 05:04:01.810052581 +0000 UTC m=+994.786516195" watchObservedRunningTime="2026-03-21 05:04:01.813100508 +0000 UTC m=+994.789564132" Mar 21 05:04:02 crc kubenswrapper[4775]: I0321 05:04:02.482777 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:04:02 crc kubenswrapper[4775]: I0321 05:04:02.482832 4775 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:02 crc kubenswrapper[4775]: I0321 05:04:02.785742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerStarted","Data":"428647d2139b15f3ab85f6734bc25c0214e38d663b4a1c7cb9b69a834fdbe970"} Mar 21 05:04:02 crc kubenswrapper[4775]: I0321 05:04:02.787650 4775 generic.go:334] "Generic (PLEG): container finished" podID="c861c77c-15ba-4204-8603-e6093bc1a0b8" containerID="ca9c08541581bc07e396b682f2ba269a753d458f0a5cc3ff5f26ace1828dea81" exitCode=0 Mar 21 05:04:02 crc kubenswrapper[4775]: I0321 05:04:02.787696 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" event={"ID":"c861c77c-15ba-4204-8603-e6093bc1a0b8","Type":"ContainerDied","Data":"ca9c08541581bc07e396b682f2ba269a753d458f0a5cc3ff5f26ace1828dea81"} Mar 21 05:04:03 crc kubenswrapper[4775]: I0321 05:04:03.794449 4775 generic.go:334] "Generic (PLEG): container finished" podID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerID="428647d2139b15f3ab85f6734bc25c0214e38d663b4a1c7cb9b69a834fdbe970" exitCode=0 Mar 21 05:04:03 crc kubenswrapper[4775]: I0321 05:04:03.795256 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerDied","Data":"428647d2139b15f3ab85f6734bc25c0214e38d663b4a1c7cb9b69a834fdbe970"} Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.050861 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.245674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s48l\" (UniqueName: \"kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l\") pod \"c861c77c-15ba-4204-8603-e6093bc1a0b8\" (UID: \"c861c77c-15ba-4204-8603-e6093bc1a0b8\") " Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.252671 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l" (OuterVolumeSpecName: "kube-api-access-7s48l") pod "c861c77c-15ba-4204-8603-e6093bc1a0b8" (UID: "c861c77c-15ba-4204-8603-e6093bc1a0b8"). InnerVolumeSpecName "kube-api-access-7s48l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.347362 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s48l\" (UniqueName: \"kubernetes.io/projected/c861c77c-15ba-4204-8603-e6093bc1a0b8-kube-api-access-7s48l\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.803160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerStarted","Data":"bfa25ad839c3791aa796fdbb1ca916c818ffc84d5a2bd52f3d19c4a9886bd0e1"} Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.805566 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" event={"ID":"c861c77c-15ba-4204-8603-e6093bc1a0b8","Type":"ContainerDied","Data":"740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed"} Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.805604 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="740ad394c5df4828658e1735d663bd2c0776d52184fe75fdf22a2c8c653e68ed" Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.805620 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-jsc7j" Mar 21 05:04:04 crc kubenswrapper[4775]: I0321 05:04:04.835873 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6nhhl" podStartSLOduration=3.344814295 podStartE2EDuration="5.835853343s" podCreationTimestamp="2026-03-21 05:03:59 +0000 UTC" firstStartedPulling="2026-03-21 05:04:01.776923962 +0000 UTC m=+994.753387596" lastFinishedPulling="2026-03-21 05:04:04.26796302 +0000 UTC m=+997.244426644" observedRunningTime="2026-03-21 05:04:04.835571075 +0000 UTC m=+997.812034699" watchObservedRunningTime="2026-03-21 05:04:04.835853343 +0000 UTC m=+997.812316967" Mar 21 05:04:05 crc kubenswrapper[4775]: I0321 05:04:05.139973 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-42hv5"] Mar 21 05:04:05 crc kubenswrapper[4775]: I0321 05:04:05.145719 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-42hv5"] Mar 21 05:04:05 crc kubenswrapper[4775]: I0321 05:04:05.669326 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cf5461-a1ca-40cb-9ed3-9c2de90faad3" path="/var/lib/kubelet/pods/40cf5461-a1ca-40cb-9ed3-9c2de90faad3/volumes" Mar 21 05:04:06 crc kubenswrapper[4775]: I0321 05:04:06.781428 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:06 crc kubenswrapper[4775]: I0321 05:04:06.782330 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:06 crc kubenswrapper[4775]: I0321 05:04:06.843411 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:07 crc kubenswrapper[4775]: I0321 05:04:07.869056 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.034987 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.575191 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.575300 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.620247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.834031 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vd7j8" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="registry-server" containerID="cri-o://81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c" gracePeriod=2 Mar 21 05:04:09 crc kubenswrapper[4775]: I0321 05:04:09.877873 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.271332 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.328173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content\") pod \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.328221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pvjj\" (UniqueName: \"kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj\") pod \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.328267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities\") pod \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\" (UID: \"0f51c821-d0a6-4b50-886c-60b46cef4f4e\") " Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.329198 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities" (OuterVolumeSpecName: "utilities") pod "0f51c821-d0a6-4b50-886c-60b46cef4f4e" (UID: "0f51c821-d0a6-4b50-886c-60b46cef4f4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.347616 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj" (OuterVolumeSpecName: "kube-api-access-8pvjj") pod "0f51c821-d0a6-4b50-886c-60b46cef4f4e" (UID: "0f51c821-d0a6-4b50-886c-60b46cef4f4e"). InnerVolumeSpecName "kube-api-access-8pvjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.360373 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f51c821-d0a6-4b50-886c-60b46cef4f4e" (UID: "0f51c821-d0a6-4b50-886c-60b46cef4f4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.430054 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.430093 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f51c821-d0a6-4b50-886c-60b46cef4f4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.430107 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pvjj\" (UniqueName: \"kubernetes.io/projected/0f51c821-d0a6-4b50-886c-60b46cef4f4e-kube-api-access-8pvjj\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.842976 4775 generic.go:334] "Generic (PLEG): container finished" podID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerID="81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c" exitCode=0 Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.843046 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd7j8" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.843087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerDied","Data":"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c"} Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.843140 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd7j8" event={"ID":"0f51c821-d0a6-4b50-886c-60b46cef4f4e","Type":"ContainerDied","Data":"f95851be8349176e3d371bd1b072ead3e10266f7c5f76bdcbc17b6e01db6f9b1"} Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.843162 4775 scope.go:117] "RemoveContainer" containerID="81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.869286 4775 scope.go:117] "RemoveContainer" containerID="e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.894252 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.900648 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd7j8"] Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.904087 4775 scope.go:117] "RemoveContainer" containerID="878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.922812 4775 scope.go:117] "RemoveContainer" containerID="81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.923232 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c\": container with ID starting with 81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c not found: ID does not exist" containerID="81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.923261 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c"} err="failed to get container status \"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c\": rpc error: code = NotFound desc = could not find container \"81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c\": container with ID starting with 81bc0fdd8eb47b55fa064c84d2bb340801a9116fe70f3e09b004ec5b7bda0d6c not found: ID does not exist" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.923281 4775 scope.go:117] "RemoveContainer" containerID="e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.923475 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3\": container with ID starting with e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3 not found: ID does not exist" containerID="e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.923496 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3"} err="failed to get container status \"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3\": rpc error: code = NotFound desc = could not find container \"e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3\": container with ID 
starting with e6ba57ef1cf15fe65d271c1c5af269adfaf901ca94cd5c198057f57301b19ca3 not found: ID does not exist" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.923513 4775 scope.go:117] "RemoveContainer" containerID="878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.923686 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b\": container with ID starting with 878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b not found: ID does not exist" containerID="878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.923706 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b"} err="failed to get container status \"878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b\": rpc error: code = NotFound desc = could not find container \"878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b\": container with ID starting with 878591f3f3b4a3527080420c1c74c3cb1ee32f80a5d8b68aea2af52a45e65a6b not found: ID does not exist" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx"] Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.998422 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="extract-content" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998437 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="extract-content" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 
05:04:10.998448 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c861c77c-15ba-4204-8603-e6093bc1a0b8" containerName="oc" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998454 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c861c77c-15ba-4204-8603-e6093bc1a0b8" containerName="oc" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.998464 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="registry-server" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998469 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="registry-server" Mar 21 05:04:10 crc kubenswrapper[4775]: E0321 05:04:10.998489 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="extract-utilities" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998495 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="extract-utilities" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998637 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c861c77c-15ba-4204-8603-e6093bc1a0b8" containerName="oc" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.998651 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" containerName="registry-server" Mar 21 05:04:10 crc kubenswrapper[4775]: I0321 05:04:10.999042 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.002042 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n6256" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.007754 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.009060 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.011563 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bjdr6" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.018895 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.028128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.036834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l564\" (UniqueName: \"kubernetes.io/projected/94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef-kube-api-access-2l564\") pod \"barbican-operator-controller-manager-59bc569d95-twhxx\" (UID: \"94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.037127 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgp7\" 
(UniqueName: \"kubernetes.io/projected/9fe71acc-7d35-4d4b-ac69-e193d3f39028-kube-api-access-fbgp7\") pod \"cinder-operator-controller-manager-8d58dc466-dn22m\" (UID: \"9fe71acc-7d35-4d4b-ac69-e193d3f39028\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.042204 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.042986 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.047975 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-8v77x" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.067167 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.067893 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.072419 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qrwxc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.082031 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.094947 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.095936 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.103345 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rnw8r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.104345 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.105213 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.107090 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vdlb4" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.116928 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.119843 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l564\" (UniqueName: \"kubernetes.io/projected/94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef-kube-api-access-2l564\") pod \"barbican-operator-controller-manager-59bc569d95-twhxx\" (UID: \"94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgp7\" (UniqueName: \"kubernetes.io/projected/9fe71acc-7d35-4d4b-ac69-e193d3f39028-kube-api-access-fbgp7\") pod \"cinder-operator-controller-manager-8d58dc466-dn22m\" (UID: \"9fe71acc-7d35-4d4b-ac69-e193d3f39028\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4js\" (UniqueName: \"kubernetes.io/projected/87dcea67-7f65-46a6-996b-3985bf1b5171-kube-api-access-sb4js\") pod \"designate-operator-controller-manager-588d4d986b-lxvtw\" (UID: 
\"87dcea67-7f65-46a6-996b-3985bf1b5171\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftw2s\" (UniqueName: \"kubernetes.io/projected/01ed348d-8a8a-4717-ba0d-1944b3f1c081-kube-api-access-ftw2s\") pod \"horizon-operator-controller-manager-8464cc45fb-7bdbc\" (UID: \"01ed348d-8a8a-4717-ba0d-1944b3f1c081\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94bs\" (UniqueName: \"kubernetes.io/projected/0e83601c-758c-4f12-b745-bb68b0c4904f-kube-api-access-h94bs\") pod \"heat-operator-controller-manager-67dd5f86f5-m4gqz\" (UID: \"0e83601c-758c-4f12-b745-bb68b0c4904f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.143874 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77nmg\" (UniqueName: \"kubernetes.io/projected/80932361-6406-48dd-9e4b-4e9c27813f68-kube-api-access-77nmg\") pod \"glance-operator-controller-manager-79df6bcc97-lzvsq\" (UID: \"80932361-6406-48dd-9e4b-4e9c27813f68\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.144349 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.145056 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.148010 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.148231 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fdr25" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.170370 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.171512 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.181879 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.182057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7pzrk" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.183431 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l564\" (UniqueName: \"kubernetes.io/projected/94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef-kube-api-access-2l564\") pod \"barbican-operator-controller-manager-59bc569d95-twhxx\" (UID: \"94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.191682 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgp7\" (UniqueName: 
\"kubernetes.io/projected/9fe71acc-7d35-4d4b-ac69-e193d3f39028-kube-api-access-fbgp7\") pod \"cinder-operator-controller-manager-8d58dc466-dn22m\" (UID: \"9fe71acc-7d35-4d4b-ac69-e193d3f39028\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.194166 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.200435 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.205231 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.205941 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.211742 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2vrnb" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.212048 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-jq88h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.212763 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.221654 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-jq88h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.223187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b6stv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.242150 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.244837 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftw2s\" (UniqueName: \"kubernetes.io/projected/01ed348d-8a8a-4717-ba0d-1944b3f1c081-kube-api-access-ftw2s\") pod \"horizon-operator-controller-manager-8464cc45fb-7bdbc\" (UID: \"01ed348d-8a8a-4717-ba0d-1944b3f1c081\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.245010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94bs\" (UniqueName: \"kubernetes.io/projected/0e83601c-758c-4f12-b745-bb68b0c4904f-kube-api-access-h94bs\") pod \"heat-operator-controller-manager-67dd5f86f5-m4gqz\" (UID: \"0e83601c-758c-4f12-b745-bb68b0c4904f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77nmg\" (UniqueName: \"kubernetes.io/projected/80932361-6406-48dd-9e4b-4e9c27813f68-kube-api-access-77nmg\") pod \"glance-operator-controller-manager-79df6bcc97-lzvsq\" (UID: \"80932361-6406-48dd-9e4b-4e9c27813f68\") 
" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbqp8\" (UniqueName: \"kubernetes.io/projected/0a66456f-7860-4dc1-9c1c-0db69ddcc800-kube-api-access-gbqp8\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247263 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hqs\" (UniqueName: \"kubernetes.io/projected/20c78c73-daf5-481e-a4ac-62de73b5969e-kube-api-access-f8hqs\") pod \"manila-operator-controller-manager-55f864c847-jq88h\" (UID: \"20c78c73-daf5-481e-a4ac-62de73b5969e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247381 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ql6\" (UniqueName: \"kubernetes.io/projected/49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd-kube-api-access-66ql6\") pod \"keystone-operator-controller-manager-768b96df4c-c7pjw\" (UID: \"49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247493 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4js\" (UniqueName: \"kubernetes.io/projected/87dcea67-7f65-46a6-996b-3985bf1b5171-kube-api-access-sb4js\") pod \"designate-operator-controller-manager-588d4d986b-lxvtw\" (UID: \"87dcea67-7f65-46a6-996b-3985bf1b5171\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:11 
crc kubenswrapper[4775]: I0321 05:04:11.247557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.247578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpz8n\" (UniqueName: \"kubernetes.io/projected/f00c8c4b-874f-45ec-8a1a-e0834b3fc252-kube-api-access-gpz8n\") pod \"ironic-operator-controller-manager-6f787dddc9-9g8pv\" (UID: \"f00c8c4b-874f-45ec-8a1a-e0834b3fc252\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.249298 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.250312 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.252543 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f97t2" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.256769 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.257729 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.264163 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.270506 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-rlj7t" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.279941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.281325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftw2s\" (UniqueName: \"kubernetes.io/projected/01ed348d-8a8a-4717-ba0d-1944b3f1c081-kube-api-access-ftw2s\") pod \"horizon-operator-controller-manager-8464cc45fb-7bdbc\" (UID: \"01ed348d-8a8a-4717-ba0d-1944b3f1c081\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.288486 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77nmg\" (UniqueName: \"kubernetes.io/projected/80932361-6406-48dd-9e4b-4e9c27813f68-kube-api-access-77nmg\") pod \"glance-operator-controller-manager-79df6bcc97-lzvsq\" (UID: \"80932361-6406-48dd-9e4b-4e9c27813f68\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.304886 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94bs\" (UniqueName: \"kubernetes.io/projected/0e83601c-758c-4f12-b745-bb68b0c4904f-kube-api-access-h94bs\") pod \"heat-operator-controller-manager-67dd5f86f5-m4gqz\" (UID: \"0e83601c-758c-4f12-b745-bb68b0c4904f\") " 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.337857 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4js\" (UniqueName: \"kubernetes.io/projected/87dcea67-7f65-46a6-996b-3985bf1b5171-kube-api-access-sb4js\") pod \"designate-operator-controller-manager-588d4d986b-lxvtw\" (UID: \"87dcea67-7f65-46a6-996b-3985bf1b5171\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.338133 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.338303 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.339302 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.362881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.375840 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ccm9j" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.376493 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.376721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbqp8\" (UniqueName: \"kubernetes.io/projected/0a66456f-7860-4dc1-9c1c-0db69ddcc800-kube-api-access-gbqp8\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.389413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hqs\" (UniqueName: \"kubernetes.io/projected/20c78c73-daf5-481e-a4ac-62de73b5969e-kube-api-access-f8hqs\") pod \"manila-operator-controller-manager-55f864c847-jq88h\" (UID: \"20c78c73-daf5-481e-a4ac-62de73b5969e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.389513 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ql6\" (UniqueName: \"kubernetes.io/projected/49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd-kube-api-access-66ql6\") pod \"keystone-operator-controller-manager-768b96df4c-c7pjw\" (UID: \"49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.389561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.389589 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gpz8n\" (UniqueName: \"kubernetes.io/projected/f00c8c4b-874f-45ec-8a1a-e0834b3fc252-kube-api-access-gpz8n\") pod \"ironic-operator-controller-manager-6f787dddc9-9g8pv\" (UID: \"f00c8c4b-874f-45ec-8a1a-e0834b3fc252\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.390722 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.390828 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:11.890776544 +0000 UTC m=+1004.867240168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.392581 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.423465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ql6\" (UniqueName: \"kubernetes.io/projected/49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd-kube-api-access-66ql6\") pod \"keystone-operator-controller-manager-768b96df4c-c7pjw\" (UID: \"49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.423686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbqp8\" (UniqueName: \"kubernetes.io/projected/0a66456f-7860-4dc1-9c1c-0db69ddcc800-kube-api-access-gbqp8\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.423913 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.429913 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.436437 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.436638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpz8n\" (UniqueName: \"kubernetes.io/projected/f00c8c4b-874f-45ec-8a1a-e0834b3fc252-kube-api-access-gpz8n\") pod \"ironic-operator-controller-manager-6f787dddc9-9g8pv\" (UID: \"f00c8c4b-874f-45ec-8a1a-e0834b3fc252\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.436827 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hqs\" (UniqueName: \"kubernetes.io/projected/20c78c73-daf5-481e-a4ac-62de73b5969e-kube-api-access-f8hqs\") pod \"manila-operator-controller-manager-55f864c847-jq88h\" (UID: \"20c78c73-daf5-481e-a4ac-62de73b5969e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.437146 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.440923 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hvcgr" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.471313 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.494636 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.504095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zv76\" (UniqueName: \"kubernetes.io/projected/9bfe7d25-53ea-484d-a481-0ea04ee2b8a8-kube-api-access-8zv76\") pod \"neutron-operator-controller-manager-767865f676-4tbvg\" (UID: \"9bfe7d25-53ea-484d-a481-0ea04ee2b8a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.504221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdjp\" (UniqueName: \"kubernetes.io/projected/02b6af47-2c06-480b-a838-2d742efa1045-kube-api-access-bgdjp\") pod \"nova-operator-controller-manager-5d488d59fb-tss7r\" (UID: \"02b6af47-2c06-480b-a838-2d742efa1045\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.504271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jdm\" (UniqueName: \"kubernetes.io/projected/ec38d53e-6fe4-41b5-8548-e49fadd9d6bf-kube-api-access-84jdm\") pod 
\"mariadb-operator-controller-manager-67ccfc9778-hf688\" (UID: \"ec38d53e-6fe4-41b5-8548-e49fadd9d6bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.504315 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tstv\" (UniqueName: \"kubernetes.io/projected/745c79b1-1bcf-4c0f-82ee-a26cbba46d48-kube-api-access-4tstv\") pod \"octavia-operator-controller-manager-5b9f45d989-67scw\" (UID: \"745c79b1-1bcf-4c0f-82ee-a26cbba46d48\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.525676 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.526462 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.528838 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cthp5" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.529066 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.533465 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.534685 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.537112 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qq45w" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.544382 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.569379 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.576434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.576925 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.578804 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.582836 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sdwdh" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.588421 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.595239 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605100 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zv76\" (UniqueName: \"kubernetes.io/projected/9bfe7d25-53ea-484d-a481-0ea04ee2b8a8-kube-api-access-8zv76\") pod \"neutron-operator-controller-manager-767865f676-4tbvg\" (UID: \"9bfe7d25-53ea-484d-a481-0ea04ee2b8a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxgwb\" (UniqueName: \"kubernetes.io/projected/898b32c5-9f21-4fba-90c5-a333f36addf2-kube-api-access-cxgwb\") pod \"placement-operator-controller-manager-5784578c99-jj4pt\" (UID: \"898b32c5-9f21-4fba-90c5-a333f36addf2\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605215 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdjp\" (UniqueName: \"kubernetes.io/projected/02b6af47-2c06-480b-a838-2d742efa1045-kube-api-access-bgdjp\") pod \"nova-operator-controller-manager-5d488d59fb-tss7r\" (UID: 
\"02b6af47-2c06-480b-a838-2d742efa1045\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnmj\" (UniqueName: \"kubernetes.io/projected/3035739a-202f-4794-bb4f-ae2342a96441-kube-api-access-jrnmj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605271 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605291 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jdm\" (UniqueName: \"kubernetes.io/projected/ec38d53e-6fe4-41b5-8548-e49fadd9d6bf-kube-api-access-84jdm\") pod \"mariadb-operator-controller-manager-67ccfc9778-hf688\" (UID: \"ec38d53e-6fe4-41b5-8548-e49fadd9d6bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tstv\" (UniqueName: \"kubernetes.io/projected/745c79b1-1bcf-4c0f-82ee-a26cbba46d48-kube-api-access-4tstv\") pod \"octavia-operator-controller-manager-5b9f45d989-67scw\" (UID: \"745c79b1-1bcf-4c0f-82ee-a26cbba46d48\") " 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.605337 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7zpk\" (UniqueName: \"kubernetes.io/projected/6eeb04ad-7251-488c-bd52-b2f14f6fb68b-kube-api-access-x7zpk\") pod \"ovn-operator-controller-manager-884679f54-ctm9h\" (UID: \"6eeb04ad-7251-488c-bd52-b2f14f6fb68b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.627569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdjp\" (UniqueName: \"kubernetes.io/projected/02b6af47-2c06-480b-a838-2d742efa1045-kube-api-access-bgdjp\") pod \"nova-operator-controller-manager-5d488d59fb-tss7r\" (UID: \"02b6af47-2c06-480b-a838-2d742efa1045\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.627744 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.629066 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.631865 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8mtqb" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.631903 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jdm\" (UniqueName: \"kubernetes.io/projected/ec38d53e-6fe4-41b5-8548-e49fadd9d6bf-kube-api-access-84jdm\") pod \"mariadb-operator-controller-manager-67ccfc9778-hf688\" (UID: \"ec38d53e-6fe4-41b5-8548-e49fadd9d6bf\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.633659 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zv76\" (UniqueName: \"kubernetes.io/projected/9bfe7d25-53ea-484d-a481-0ea04ee2b8a8-kube-api-access-8zv76\") pod \"neutron-operator-controller-manager-767865f676-4tbvg\" (UID: \"9bfe7d25-53ea-484d-a481-0ea04ee2b8a8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.633721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tstv\" (UniqueName: \"kubernetes.io/projected/745c79b1-1bcf-4c0f-82ee-a26cbba46d48-kube-api-access-4tstv\") pod \"octavia-operator-controller-manager-5b9f45d989-67scw\" (UID: \"745c79b1-1bcf-4c0f-82ee-a26cbba46d48\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.647971 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.702670 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0f51c821-d0a6-4b50-886c-60b46cef4f4e" path="/var/lib/kubelet/pods/0f51c821-d0a6-4b50-886c-60b46cef4f4e/volumes" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.704504 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.708170 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.708923 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.714210 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.715599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-czmtg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.716078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7zpk\" (UniqueName: \"kubernetes.io/projected/6eeb04ad-7251-488c-bd52-b2f14f6fb68b-kube-api-access-x7zpk\") pod \"ovn-operator-controller-manager-884679f54-ctm9h\" (UID: \"6eeb04ad-7251-488c-bd52-b2f14f6fb68b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.716150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8sv\" (UniqueName: \"kubernetes.io/projected/5968f1d9-f4e0-4c67-923e-2494e15c4088-kube-api-access-6g8sv\") pod \"swift-operator-controller-manager-c674c5965-lqmgv\" (UID: 
\"5968f1d9-f4e0-4c67-923e-2494e15c4088\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.718698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxgwb\" (UniqueName: \"kubernetes.io/projected/898b32c5-9f21-4fba-90c5-a333f36addf2-kube-api-access-cxgwb\") pod \"placement-operator-controller-manager-5784578c99-jj4pt\" (UID: \"898b32c5-9f21-4fba-90c5-a333f36addf2\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.721555 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.722413 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.728055 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9psgs" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.740804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnmj\" (UniqueName: \"kubernetes.io/projected/3035739a-202f-4794-bb4f-ae2342a96441-kube-api-access-jrnmj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.740905 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: 
\"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.741144 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.741203 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:12.241179823 +0000 UTC m=+1005.217643447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.745140 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.756821 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.771218 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.773726 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7zpk\" (UniqueName: \"kubernetes.io/projected/6eeb04ad-7251-488c-bd52-b2f14f6fb68b-kube-api-access-x7zpk\") pod \"ovn-operator-controller-manager-884679f54-ctm9h\" (UID: \"6eeb04ad-7251-488c-bd52-b2f14f6fb68b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.774637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxgwb\" (UniqueName: \"kubernetes.io/projected/898b32c5-9f21-4fba-90c5-a333f36addf2-kube-api-access-cxgwb\") pod \"placement-operator-controller-manager-5784578c99-jj4pt\" (UID: \"898b32c5-9f21-4fba-90c5-a333f36addf2\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.794176 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.787430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnmj\" (UniqueName: \"kubernetes.io/projected/3035739a-202f-4794-bb4f-ae2342a96441-kube-api-access-jrnmj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.817526 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.837375 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.838473 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.842534 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tf6bp" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.843224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thj25\" (UniqueName: \"kubernetes.io/projected/d1dbd80a-0782-4035-a263-b52a90f6ee0e-kube-api-access-thj25\") pod \"telemetry-operator-controller-manager-d6b694c5-9s82h\" (UID: \"d1dbd80a-0782-4035-a263-b52a90f6ee0e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.843335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdnx\" (UniqueName: \"kubernetes.io/projected/9cac78ed-6325-4649-bb05-a1518ae692e9-kube-api-access-4tdnx\") pod \"test-operator-controller-manager-5c5cb9c4d7-l59gx\" (UID: \"9cac78ed-6325-4649-bb05-a1518ae692e9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.843373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8sv\" (UniqueName: \"kubernetes.io/projected/5968f1d9-f4e0-4c67-923e-2494e15c4088-kube-api-access-6g8sv\") pod \"swift-operator-controller-manager-c674c5965-lqmgv\" (UID: \"5968f1d9-f4e0-4c67-923e-2494e15c4088\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.850667 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.881411 4775 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.890008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8sv\" (UniqueName: \"kubernetes.io/projected/5968f1d9-f4e0-4c67-923e-2494e15c4088-kube-api-access-6g8sv\") pod \"swift-operator-controller-manager-c674c5965-lqmgv\" (UID: \"5968f1d9-f4e0-4c67-923e-2494e15c4088\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.902900 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"] Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.904398 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.911158 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.912201 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.912296 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zqnnz" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.938053 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945101 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945176 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thj25\" (UniqueName: \"kubernetes.io/projected/d1dbd80a-0782-4035-a263-b52a90f6ee0e-kube-api-access-thj25\") pod \"telemetry-operator-controller-manager-d6b694c5-9s82h\" (UID: \"d1dbd80a-0782-4035-a263-b52a90f6ee0e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdnx\" (UniqueName: \"kubernetes.io/projected/9cac78ed-6325-4649-bb05-a1518ae692e9-kube-api-access-4tdnx\") pod \"test-operator-controller-manager-5c5cb9c4d7-l59gx\" (UID: \"9cac78ed-6325-4649-bb05-a1518ae692e9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945381 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg86m\" (UniqueName: \"kubernetes.io/projected/907f0cdf-2d87-4d09-97af-5591d061b4f6-kube-api-access-cg86m\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:11 crc kubenswrapper[4775]: I0321 05:04:11.945429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvqc\" (UniqueName: \"kubernetes.io/projected/3c9f18bd-def6-45ff-a92b-25c6f40d6bb5-kube-api-access-clvqc\") pod \"watcher-operator-controller-manager-6c4d75f7f9-gdrc5\" (UID: \"3c9f18bd-def6-45ff-a92b-25c6f40d6bb5\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.945606 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:11 crc kubenswrapper[4775]: E0321 05:04:11.945647 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:12.945633088 +0000 UTC m=+1005.922096712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.022630 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.042875 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.047266 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.048065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg86m\" (UniqueName: \"kubernetes.io/projected/907f0cdf-2d87-4d09-97af-5591d061b4f6-kube-api-access-cg86m\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.049245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.049381 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvqc\" (UniqueName: 
\"kubernetes.io/projected/3c9f18bd-def6-45ff-a92b-25c6f40d6bb5-kube-api-access-clvqc\") pod \"watcher-operator-controller-manager-6c4d75f7f9-gdrc5\" (UID: \"3c9f18bd-def6-45ff-a92b-25c6f40d6bb5\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.049453 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.049637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.057290 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.057619 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:12.55758417 +0000 UTC m=+1005.534047794 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.057839 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2ksnd" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.057953 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.058072 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:12.558058524 +0000 UTC m=+1005.534522148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.057439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thj25\" (UniqueName: \"kubernetes.io/projected/d1dbd80a-0782-4035-a263-b52a90f6ee0e-kube-api-access-thj25\") pod \"telemetry-operator-controller-manager-d6b694c5-9s82h\" (UID: \"d1dbd80a-0782-4035-a263-b52a90f6ee0e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.077562 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.081939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdnx\" (UniqueName: \"kubernetes.io/projected/9cac78ed-6325-4649-bb05-a1518ae692e9-kube-api-access-4tdnx\") pod \"test-operator-controller-manager-5c5cb9c4d7-l59gx\" (UID: \"9cac78ed-6325-4649-bb05-a1518ae692e9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.093952 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.125148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvqc\" (UniqueName: \"kubernetes.io/projected/3c9f18bd-def6-45ff-a92b-25c6f40d6bb5-kube-api-access-clvqc\") pod \"watcher-operator-controller-manager-6c4d75f7f9-gdrc5\" (UID: \"3c9f18bd-def6-45ff-a92b-25c6f40d6bb5\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.141774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg86m\" (UniqueName: \"kubernetes.io/projected/907f0cdf-2d87-4d09-97af-5591d061b4f6-kube-api-access-cg86m\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.184834 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.204266 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.211052 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.223198 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.238541 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.248948 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.259870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.259976 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdx89\" (UniqueName: \"kubernetes.io/projected/95a44b12-e027-400d-b257-99f2012251d8-kube-api-access-kdx89\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jvglf\" (UID: \"95a44b12-e027-400d-b257-99f2012251d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.260173 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.260213 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:13.260199123 +0000 UTC m=+1006.236662747 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.363438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdx89\" (UniqueName: \"kubernetes.io/projected/95a44b12-e027-400d-b257-99f2012251d8-kube-api-access-kdx89\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jvglf\" (UID: \"95a44b12-e027-400d-b257-99f2012251d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.386302 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdx89\" (UniqueName: \"kubernetes.io/projected/95a44b12-e027-400d-b257-99f2012251d8-kube-api-access-kdx89\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jvglf\" (UID: \"95a44b12-e027-400d-b257-99f2012251d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.431215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.455527 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.523812 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.565856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.565967 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.565993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.566034 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:13.56601363 +0000 UTC m=+1006.542477254 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.566100 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.566151 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:13.566139723 +0000 UTC m=+1006.542603347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.574193 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc"] Mar 21 05:04:12 crc kubenswrapper[4775]: W0321 05:04:12.611806 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ed348d_8a8a_4717_ba0d_1944b3f1c081.slice/crio-370ea74a650349df6fa12d38039e6e6c3bd1b1f3b383cb055129fd355e72f1c6 WatchSource:0}: Error finding container 370ea74a650349df6fa12d38039e6e6c3bd1b1f3b383cb055129fd355e72f1c6: Status 404 returned error can't find the container with id 370ea74a650349df6fa12d38039e6e6c3bd1b1f3b383cb055129fd355e72f1c6 Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.902386 4775 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.909555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw"] Mar 21 05:04:12 crc kubenswrapper[4775]: W0321 05:04:12.922292 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf00c8c4b_874f_45ec_8a1a_e0834b3fc252.slice/crio-9952a05dffed9417a338c1437660f1e8c0b0b0d0d8381950453c223b515205a5 WatchSource:0}: Error finding container 9952a05dffed9417a338c1437660f1e8c0b0b0d0d8381950453c223b515205a5: Status 404 returned error can't find the container with id 9952a05dffed9417a338c1437660f1e8c0b0b0d0d8381950453c223b515205a5 Mar 21 05:04:12 crc kubenswrapper[4775]: W0321 05:04:12.924395 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a3c6c7_9e86_495d_8d1a_486d6bfbbbdd.slice/crio-0e5d6af419e11ff092281a5770d12a7c04f50f7843675dbc29cbb9a75a9d66fb WatchSource:0}: Error finding container 0e5d6af419e11ff092281a5770d12a7c04f50f7843675dbc29cbb9a75a9d66fb: Status 404 returned error can't find the container with id 0e5d6af419e11ff092281a5770d12a7c04f50f7843675dbc29cbb9a75a9d66fb Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.925532 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-jq88h"] Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.929687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" event={"ID":"94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef","Type":"ContainerStarted","Data":"9463c706f5f47e68a62ef68715be3e8c7f5a1b32aadc69665b5db9b96a7ce7bd"} Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.934150 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" event={"ID":"0e83601c-758c-4f12-b745-bb68b0c4904f","Type":"ContainerStarted","Data":"7c70d5135af72bc479363a2adcde7e746fa83857807b8d04642a6365ca7ae8f5"} Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.936185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" event={"ID":"80932361-6406-48dd-9e4b-4e9c27813f68","Type":"ContainerStarted","Data":"d9a8ee0cbb4e1d1ae9a8e8706a43da215a4d62cb7e6828d519feacdd63413f2e"} Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.937987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" event={"ID":"9fe71acc-7d35-4d4b-ac69-e193d3f39028","Type":"ContainerStarted","Data":"f192a51da0996ede725942f75554886d6952f4b898a6ee74b81f602102e264c8"} Mar 21 05:04:12 crc kubenswrapper[4775]: W0321 05:04:12.938262 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c78c73_daf5_481e_a4ac_62de73b5969e.slice/crio-93ad1f6e6168188143481da9a75415dc36e74d1bf96d6519927b1c6f285c4852 WatchSource:0}: Error finding container 93ad1f6e6168188143481da9a75415dc36e74d1bf96d6519927b1c6f285c4852: Status 404 returned error can't find the container with id 93ad1f6e6168188143481da9a75415dc36e74d1bf96d6519927b1c6f285c4852 Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.943866 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" event={"ID":"87dcea67-7f65-46a6-996b-3985bf1b5171","Type":"ContainerStarted","Data":"f8bf0412db315132c2c4a3b4becf65c579aea5c697c3bbf4a646c390dab460a7"} Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.946691 4775 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-6nhhl" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="registry-server" containerID="cri-o://bfa25ad839c3791aa796fdbb1ca916c818ffc84d5a2bd52f3d19c4a9886bd0e1" gracePeriod=2 Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.946773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" event={"ID":"01ed348d-8a8a-4717-ba0d-1944b3f1c081","Type":"ContainerStarted","Data":"370ea74a650349df6fa12d38039e6e6c3bd1b1f3b383cb055129fd355e72f1c6"} Mar 21 05:04:12 crc kubenswrapper[4775]: I0321 05:04:12.971873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.972100 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:12 crc kubenswrapper[4775]: E0321 05:04:12.972179 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:14.97215903 +0000 UTC m=+1007.948622654 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.229576 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.235486 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.256739 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.262755 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.277460 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.277938 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.277994 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert 
podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:15.277976727 +0000 UTC m=+1008.254440351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.278402 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.286194 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.299312 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.305224 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.312351 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.321211 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5"] Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.324358 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf"] Mar 21 05:04:13 crc kubenswrapper[4775]: W0321 05:04:13.350896 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c9f18bd_def6_45ff_a92b_25c6f40d6bb5.slice/crio-ad1a20a2a60b39e31bf0dc14b2a2a50fe4d299b742bce9bbf2eed2a2b544d612 WatchSource:0}: Error finding container ad1a20a2a60b39e31bf0dc14b2a2a50fe4d299b742bce9bbf2eed2a2b544d612: Status 404 returned error can't find the container with id ad1a20a2a60b39e31bf0dc14b2a2a50fe4d299b742bce9bbf2eed2a2b544d612 Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.351485 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdx89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-jvglf_openstack-operators(95a44b12-e027-400d-b257-99f2012251d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.351603 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x7zpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-ctm9h_openstack-operators(6eeb04ad-7251-488c-bd52-b2f14f6fb68b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.351629 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zv76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-4tbvg_openstack-operators(9bfe7d25-53ea-484d-a481-0ea04ee2b8a8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.351747 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84jdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-hf688_openstack-operators(ec38d53e-6fe4-41b5-8548-e49fadd9d6bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.353158 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" podUID="9bfe7d25-53ea-484d-a481-0ea04ee2b8a8" Mar 21 05:04:13 crc 
kubenswrapper[4775]: E0321 05:04:13.353203 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" podUID="ec38d53e-6fe4-41b5-8548-e49fadd9d6bf" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.353225 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" podUID="6eeb04ad-7251-488c-bd52-b2f14f6fb68b" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.353244 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" podUID="95a44b12-e027-400d-b257-99f2012251d8" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.368329 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-clvqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-gdrc5_openstack-operators(3c9f18bd-def6-45ff-a92b-25c6f40d6bb5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.369498 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" podUID="3c9f18bd-def6-45ff-a92b-25c6f40d6bb5" Mar 21 05:04:13 crc 
kubenswrapper[4775]: I0321 05:04:13.581022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.581127 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.581223 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.581281 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.581350 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:15.581325084 +0000 UTC m=+1008.557788708 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.581370 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:15.581362275 +0000 UTC m=+1008.557825899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.955176 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" event={"ID":"5968f1d9-f4e0-4c67-923e-2494e15c4088","Type":"ContainerStarted","Data":"c22a3b421b20aebd6844914d88ac1b1eca165d2c5ff23665d7ffe3ed50be4b1e"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.958272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" event={"ID":"02b6af47-2c06-480b-a838-2d742efa1045","Type":"ContainerStarted","Data":"7906999cc16e82fcb0dcd2e5020ffe06fa838cf9857fb99542fdcc09100516ff"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.959260 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" 
event={"ID":"745c79b1-1bcf-4c0f-82ee-a26cbba46d48","Type":"ContainerStarted","Data":"95e47c4facffa5845ce3352cfbc986a6018f2c261d3b23fb69c8a344f68ab3c9"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.961780 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" event={"ID":"ec38d53e-6fe4-41b5-8548-e49fadd9d6bf","Type":"ContainerStarted","Data":"543341eb244d971192be7aebcdafb74132e2b5df8f2c5068cf04dab2caab621a"} Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.964249 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" podUID="ec38d53e-6fe4-41b5-8548-e49fadd9d6bf" Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.966900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" event={"ID":"898b32c5-9f21-4fba-90c5-a333f36addf2","Type":"ContainerStarted","Data":"dc24878786da3f3dbe3c3011dfcb502d4c5a8dfec748074f92c806f91a7029a0"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.969138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" event={"ID":"d1dbd80a-0782-4035-a263-b52a90f6ee0e","Type":"ContainerStarted","Data":"018b768dcada9e8a05496b77c87fd5746e1d3d786081d4cceed611a15644df26"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.971030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" 
event={"ID":"f00c8c4b-874f-45ec-8a1a-e0834b3fc252","Type":"ContainerStarted","Data":"9952a05dffed9417a338c1437660f1e8c0b0b0d0d8381950453c223b515205a5"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.972145 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" event={"ID":"95a44b12-e027-400d-b257-99f2012251d8","Type":"ContainerStarted","Data":"806e83334c792a8b0305a7f67d211b4f3a82d2f33b6b96148798e5a534611c0d"} Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.973545 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" podUID="95a44b12-e027-400d-b257-99f2012251d8" Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.974607 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" event={"ID":"20c78c73-daf5-481e-a4ac-62de73b5969e","Type":"ContainerStarted","Data":"93ad1f6e6168188143481da9a75415dc36e74d1bf96d6519927b1c6f285c4852"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.975776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" event={"ID":"49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd","Type":"ContainerStarted","Data":"0e5d6af419e11ff092281a5770d12a7c04f50f7843675dbc29cbb9a75a9d66fb"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.976776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" 
event={"ID":"9cac78ed-6325-4649-bb05-a1518ae692e9","Type":"ContainerStarted","Data":"d33f5adf3ebacaf877df27a18755fffa61d1f5869b445de66666e8c34353580e"} Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.977622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" event={"ID":"9bfe7d25-53ea-484d-a481-0ea04ee2b8a8","Type":"ContainerStarted","Data":"e28c181cd4648baa68bcef5fa441b083107a741b5c2a8f3ec951c960aabffcf3"} Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.979406 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" podUID="9bfe7d25-53ea-484d-a481-0ea04ee2b8a8" Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.981192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" event={"ID":"3c9f18bd-def6-45ff-a92b-25c6f40d6bb5","Type":"ContainerStarted","Data":"ad1a20a2a60b39e31bf0dc14b2a2a50fe4d299b742bce9bbf2eed2a2b544d612"} Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.982325 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" podUID="3c9f18bd-def6-45ff-a92b-25c6f40d6bb5" Mar 21 05:04:13 crc kubenswrapper[4775]: I0321 05:04:13.982876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" 
event={"ID":"6eeb04ad-7251-488c-bd52-b2f14f6fb68b","Type":"ContainerStarted","Data":"148ea4025f8ef65bc1a19cec65c2ba6919e2bb2969fbd9638161e6500d07dc60"} Mar 21 05:04:13 crc kubenswrapper[4775]: E0321 05:04:13.984448 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" podUID="6eeb04ad-7251-488c-bd52-b2f14f6fb68b" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.003864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.004025 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.004352 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:19.004327253 +0000 UTC m=+1011.980790867 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.018408 4775 generic.go:334] "Generic (PLEG): container finished" podID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerID="bfa25ad839c3791aa796fdbb1ca916c818ffc84d5a2bd52f3d19c4a9886bd0e1" exitCode=0 Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.018489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerDied","Data":"bfa25ad839c3791aa796fdbb1ca916c818ffc84d5a2bd52f3d19c4a9886bd0e1"} Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.023991 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" podUID="ec38d53e-6fe4-41b5-8548-e49fadd9d6bf" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.024059 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" podUID="3c9f18bd-def6-45ff-a92b-25c6f40d6bb5" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.024284 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" podUID="6eeb04ad-7251-488c-bd52-b2f14f6fb68b" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.024334 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" podUID="9bfe7d25-53ea-484d-a481-0ea04ee2b8a8" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.026340 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" podUID="95a44b12-e027-400d-b257-99f2012251d8" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.309887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.310130 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.310188 
4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:19.31017322 +0000 UTC m=+1012.286636844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.616362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.616481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.616531 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.616677 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.616713 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:19.616691197 +0000 UTC m=+1012.593154821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: E0321 05:04:15.616734 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:19.616726068 +0000 UTC m=+1012.593189692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.804424 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.919803 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities\") pod \"694f1c66-ce56-4854-9f4c-02a42e16046d\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.919940 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6ln\" (UniqueName: \"kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln\") pod \"694f1c66-ce56-4854-9f4c-02a42e16046d\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.919964 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content\") pod \"694f1c66-ce56-4854-9f4c-02a42e16046d\" (UID: \"694f1c66-ce56-4854-9f4c-02a42e16046d\") " Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.920728 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities" (OuterVolumeSpecName: "utilities") pod "694f1c66-ce56-4854-9f4c-02a42e16046d" (UID: "694f1c66-ce56-4854-9f4c-02a42e16046d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.931180 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln" (OuterVolumeSpecName: "kube-api-access-xg6ln") pod "694f1c66-ce56-4854-9f4c-02a42e16046d" (UID: "694f1c66-ce56-4854-9f4c-02a42e16046d"). InnerVolumeSpecName "kube-api-access-xg6ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:15 crc kubenswrapper[4775]: I0321 05:04:15.987401 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "694f1c66-ce56-4854-9f4c-02a42e16046d" (UID: "694f1c66-ce56-4854-9f4c-02a42e16046d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.021670 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.021706 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6ln\" (UniqueName: \"kubernetes.io/projected/694f1c66-ce56-4854-9f4c-02a42e16046d-kube-api-access-xg6ln\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.021717 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/694f1c66-ce56-4854-9f4c-02a42e16046d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.032433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nhhl" event={"ID":"694f1c66-ce56-4854-9f4c-02a42e16046d","Type":"ContainerDied","Data":"3f6120ae96f0b10b5bf76bd3efa458fabb2c0680a195b83d1f67570f61231bb6"} Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.032482 4775 scope.go:117] "RemoveContainer" containerID="bfa25ad839c3791aa796fdbb1ca916c818ffc84d5a2bd52f3d19c4a9886bd0e1" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.032498 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6nhhl" Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.067174 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:04:16 crc kubenswrapper[4775]: I0321 05:04:16.071539 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6nhhl"] Mar 21 05:04:17 crc kubenswrapper[4775]: I0321 05:04:17.674009 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" path="/var/lib/kubelet/pods/694f1c66-ce56-4854-9f4c-02a42e16046d/volumes" Mar 21 05:04:19 crc kubenswrapper[4775]: I0321 05:04:19.069971 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.070269 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.070845 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:27.070820738 +0000 UTC m=+1020.047284362 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: I0321 05:04:19.374776 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.375002 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.375131 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:27.375091901 +0000 UTC m=+1020.351555535 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: I0321 05:04:19.679842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:19 crc kubenswrapper[4775]: I0321 05:04:19.680004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.680180 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.680218 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.680303 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:27.68027678 +0000 UTC m=+1020.656740404 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:19 crc kubenswrapper[4775]: E0321 05:04:19.680329 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:27.680317782 +0000 UTC m=+1020.656781406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: I0321 05:04:27.095002 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.095190 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.095684 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert podName:0a66456f-7860-4dc1-9c1c-0db69ddcc800 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:43.095668035 +0000 UTC m=+1036.072131659 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert") pod "infra-operator-controller-manager-65f65cc49c-2mgp8" (UID: "0a66456f-7860-4dc1-9c1c-0db69ddcc800") : secret "infra-operator-webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: I0321 05:04:27.289805 4775 scope.go:117] "RemoveContainer" containerID="428647d2139b15f3ab85f6734bc25c0214e38d663b4a1c7cb9b69a834fdbe970" Mar 21 05:04:27 crc kubenswrapper[4775]: I0321 05:04:27.400441 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.400707 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.400900 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert podName:3035739a-202f-4794-bb4f-ae2342a96441 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:43.400878605 +0000 UTC m=+1036.377342239 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-rwq59" (UID: "3035739a-202f-4794-bb4f-ae2342a96441") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: I0321 05:04:27.706651 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:27 crc kubenswrapper[4775]: I0321 05:04:27.706782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.706938 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.707156 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:43.707134294 +0000 UTC m=+1036.683597938 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "webhook-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.707353 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.707454 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs podName:907f0cdf-2d87-4d09-97af-5591d061b4f6 nodeName:}" failed. No retries permitted until 2026-03-21 05:04:43.707396922 +0000 UTC m=+1036.683860626 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs") pod "openstack-operator-controller-manager-65746ff4dc-hg4rq" (UID: "907f0cdf-2d87-4d09-97af-5591d061b4f6") : secret "metrics-server-cert" not found Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.877073 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.877597 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-66ql6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-c7pjw_openstack-operators(49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:04:27 crc kubenswrapper[4775]: E0321 05:04:27.879336 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" podUID="49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd" Mar 21 05:04:28 crc kubenswrapper[4775]: E0321 05:04:28.127348 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" podUID="49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd" Mar 21 05:04:28 crc kubenswrapper[4775]: I0321 05:04:28.497386 4775 scope.go:117] "RemoveContainer" 
containerID="5780c596a9174ce034338976560a433a9f2ac7cb3aacc03fbb37bc5801d6945f" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.163361 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" event={"ID":"5968f1d9-f4e0-4c67-923e-2494e15c4088","Type":"ContainerStarted","Data":"70696836ff0e4143f237059b34fd8d37fdd8ca4839c9b8f25d96699ea9ab6f17"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.164614 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.167490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" event={"ID":"9fe71acc-7d35-4d4b-ac69-e193d3f39028","Type":"ContainerStarted","Data":"d86b0eb4fc5ac0e1148a2e480f6a5150f677dc209391e3444eab379696449830"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.167617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.174272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" event={"ID":"94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef","Type":"ContainerStarted","Data":"ee7de642c14e7c121b797dcc7b48f25edaca96fc418edb9a96735178a2faf5cb"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.174700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.183915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" 
event={"ID":"898b32c5-9f21-4fba-90c5-a333f36addf2","Type":"ContainerStarted","Data":"1a40b0f2ff98417dd04737fdafd26d4727fd546a6a1a9fe86114fe4c03484219"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.184631 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.198402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" event={"ID":"0e83601c-758c-4f12-b745-bb68b0c4904f","Type":"ContainerStarted","Data":"4db7d98f3b32e2b010f8cbc92ce1f2cd269ba478ee31833e3ae2d5dd0db7771c"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.199145 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.218970 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" event={"ID":"f00c8c4b-874f-45ec-8a1a-e0834b3fc252","Type":"ContainerStarted","Data":"95e910eb8aaa7fe737248a37785fec97e0489fab14a673d21fae213a2802dd26"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.219843 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.222431 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m" podStartSLOduration=2.959640905 podStartE2EDuration="19.222418288s" podCreationTimestamp="2026-03-21 05:04:10 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.212263484 +0000 UTC m=+1005.188727098" lastFinishedPulling="2026-03-21 05:04:28.475040857 +0000 UTC m=+1021.451504481" 
observedRunningTime="2026-03-21 05:04:29.22036637 +0000 UTC m=+1022.196829994" watchObservedRunningTime="2026-03-21 05:04:29.222418288 +0000 UTC m=+1022.198881912" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.235518 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv" podStartSLOduration=3.076224519 podStartE2EDuration="18.235501189s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.335540198 +0000 UTC m=+1006.312003812" lastFinishedPulling="2026-03-21 05:04:28.494816858 +0000 UTC m=+1021.471280482" observedRunningTime="2026-03-21 05:04:29.1928296 +0000 UTC m=+1022.169293224" watchObservedRunningTime="2026-03-21 05:04:29.235501189 +0000 UTC m=+1022.211964813" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.248363 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt" podStartSLOduration=3.081477979 podStartE2EDuration="18.248347063s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.330581438 +0000 UTC m=+1006.307045062" lastFinishedPulling="2026-03-21 05:04:28.497450512 +0000 UTC m=+1021.473914146" observedRunningTime="2026-03-21 05:04:29.24646943 +0000 UTC m=+1022.222933054" watchObservedRunningTime="2026-03-21 05:04:29.248347063 +0000 UTC m=+1022.224810687" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.261956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" event={"ID":"745c79b1-1bcf-4c0f-82ee-a26cbba46d48","Type":"ContainerStarted","Data":"d6dacc08d0b5fc70453c979724b352358b60f46a379573af9e32ec3ebcfecca5"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.262107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.277237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" event={"ID":"20c78c73-daf5-481e-a4ac-62de73b5969e","Type":"ContainerStarted","Data":"f38e961c3dacd609f00cdde887590e2509ab4b7c8f5532c2b8f9b831d7b14589"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.278450 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.281695 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx" podStartSLOduration=5.52008238 podStartE2EDuration="19.281679648s" podCreationTimestamp="2026-03-21 05:04:10 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.232270521 +0000 UTC m=+1005.208734145" lastFinishedPulling="2026-03-21 05:04:25.993867769 +0000 UTC m=+1018.970331413" observedRunningTime="2026-03-21 05:04:29.277287513 +0000 UTC m=+1022.253751137" watchObservedRunningTime="2026-03-21 05:04:29.281679648 +0000 UTC m=+1022.258143272" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.290725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" event={"ID":"01ed348d-8a8a-4717-ba0d-1944b3f1c081","Type":"ContainerStarted","Data":"ef3f8a1625951c2579ed0041d2865d7f33f87f1596a3a6638ec0d1956bd20e81"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.290888 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.292528 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" event={"ID":"80932361-6406-48dd-9e4b-4e9c27813f68","Type":"ContainerStarted","Data":"0b58b73bdee75df59b7a689dec683772bc72f71c86539f887e29004c32c54513"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.293060 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.298987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" event={"ID":"87dcea67-7f65-46a6-996b-3985bf1b5171","Type":"ContainerStarted","Data":"4cf5f0d88106b727b54d14ce775a20511fe8a7ddffdd0c044cf055e64587d28c"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.299675 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.301702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" event={"ID":"9cac78ed-6325-4649-bb05-a1518ae692e9","Type":"ContainerStarted","Data":"6e15552f7b63ced1a696101c769a57ef850ee06be84603155bd1e7b8d4a0ab75"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.302726 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.315611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" event={"ID":"d1dbd80a-0782-4035-a263-b52a90f6ee0e","Type":"ContainerStarted","Data":"9b0cd4027cc0b41c55b88ed38012ebbfd8115d1321fa518047bc63c6626772dc"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.316398 4775 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.322715 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw" podStartSLOduration=3.109531434 podStartE2EDuration="18.32269215s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.285801599 +0000 UTC m=+1006.262265223" lastFinishedPulling="2026-03-21 05:04:28.498962315 +0000 UTC m=+1021.475425939" observedRunningTime="2026-03-21 05:04:29.321590979 +0000 UTC m=+1022.298054603" watchObservedRunningTime="2026-03-21 05:04:29.32269215 +0000 UTC m=+1022.299155784" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.355283 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" event={"ID":"02b6af47-2c06-480b-a838-2d742efa1045","Type":"ContainerStarted","Data":"814e99ad1de41e07f65980c021d5b5f6979fbdaf273ce4c227304bbbd964feca"} Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.355788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.388534 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv" podStartSLOduration=5.326768252 podStartE2EDuration="18.388518976s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.928448781 +0000 UTC m=+1005.904912405" lastFinishedPulling="2026-03-21 05:04:25.990199485 +0000 UTC m=+1018.966663129" observedRunningTime="2026-03-21 05:04:29.385426568 +0000 UTC m=+1022.361890192" watchObservedRunningTime="2026-03-21 05:04:29.388518976 +0000 UTC m=+1022.364982590" 
Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.433876 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz" podStartSLOduration=2.442450467 podStartE2EDuration="18.43385771s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.483651775 +0000 UTC m=+1005.460115399" lastFinishedPulling="2026-03-21 05:04:28.475059018 +0000 UTC m=+1021.451522642" observedRunningTime="2026-03-21 05:04:29.432361627 +0000 UTC m=+1022.408825251" watchObservedRunningTime="2026-03-21 05:04:29.43385771 +0000 UTC m=+1022.410321334" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.505008 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx" podStartSLOduration=3.336834745 podStartE2EDuration="18.504993236s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.335204919 +0000 UTC m=+1006.311668543" lastFinishedPulling="2026-03-21 05:04:28.50336342 +0000 UTC m=+1021.479827034" observedRunningTime="2026-03-21 05:04:29.503639297 +0000 UTC m=+1022.480102921" watchObservedRunningTime="2026-03-21 05:04:29.504993236 +0000 UTC m=+1022.481456860" Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.623316 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc" podStartSLOduration=2.751796615 podStartE2EDuration="18.623300909s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.632650548 +0000 UTC m=+1005.609114172" lastFinishedPulling="2026-03-21 05:04:28.504154842 +0000 UTC m=+1021.480618466" observedRunningTime="2026-03-21 05:04:29.574501046 +0000 UTC m=+1022.550964670" watchObservedRunningTime="2026-03-21 05:04:29.623300909 +0000 UTC m=+1022.599764533" Mar 21 05:04:29 crc 
kubenswrapper[4775]: I0321 05:04:29.624705 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq" podStartSLOduration=2.62357499 podStartE2EDuration="18.624698038s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.475206936 +0000 UTC m=+1005.451670560" lastFinishedPulling="2026-03-21 05:04:28.476329984 +0000 UTC m=+1021.452793608" observedRunningTime="2026-03-21 05:04:29.619444269 +0000 UTC m=+1022.595907893" watchObservedRunningTime="2026-03-21 05:04:29.624698038 +0000 UTC m=+1022.601161652"
Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.689608 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r" podStartSLOduration=3.42278902 podStartE2EDuration="18.689589267s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.330373872 +0000 UTC m=+1006.306837506" lastFinishedPulling="2026-03-21 05:04:28.597174129 +0000 UTC m=+1021.573637753" observedRunningTime="2026-03-21 05:04:29.685470621 +0000 UTC m=+1022.661934255" watchObservedRunningTime="2026-03-21 05:04:29.689589267 +0000 UTC m=+1022.666052891"
Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.749715 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h" podStartSLOduration=3.519866902 podStartE2EDuration="18.749702891s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.28548837 +0000 UTC m=+1006.261951994" lastFinishedPulling="2026-03-21 05:04:28.515324359 +0000 UTC m=+1021.491787983" observedRunningTime="2026-03-21 05:04:29.749072323 +0000 UTC m=+1022.725535947" watchObservedRunningTime="2026-03-21 05:04:29.749702891 +0000 UTC m=+1022.726166505"
Mar 21 05:04:29 crc kubenswrapper[4775]: I0321 05:04:29.781293 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw" podStartSLOduration=6.000402412 podStartE2EDuration="19.781276616s" podCreationTimestamp="2026-03-21 05:04:10 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.212211153 +0000 UTC m=+1005.188674777" lastFinishedPulling="2026-03-21 05:04:25.993085347 +0000 UTC m=+1018.969548981" observedRunningTime="2026-03-21 05:04:29.77753074 +0000 UTC m=+1022.753994364" watchObservedRunningTime="2026-03-21 05:04:29.781276616 +0000 UTC m=+1022.757740240"
Mar 21 05:04:32 crc kubenswrapper[4775]: I0321 05:04:32.482195 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:04:32 crc kubenswrapper[4775]: I0321 05:04:32.482750 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:04:32 crc kubenswrapper[4775]: I0321 05:04:32.482808 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn"
Mar 21 05:04:32 crc kubenswrapper[4775]: I0321 05:04:32.483365 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:04:32 crc kubenswrapper[4775]: I0321 05:04:32.483571 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7" gracePeriod=600
Mar 21 05:04:33 crc kubenswrapper[4775]: I0321 05:04:33.391652 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7" exitCode=0
Mar 21 05:04:33 crc kubenswrapper[4775]: I0321 05:04:33.391705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7"}
Mar 21 05:04:33 crc kubenswrapper[4775]: I0321 05:04:33.391747 4775 scope.go:117] "RemoveContainer" containerID="eaf646a6237d4b4ce6a8a82755505f01a336cf93fef407f02dbf82d68f5008b4"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.409415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" event={"ID":"95a44b12-e027-400d-b257-99f2012251d8","Type":"ContainerStarted","Data":"b9109d8ccbf32cbfa0cff8c61c0680d0eb7eb54e6d830c96fe75217312b7cdc8"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.411793 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" event={"ID":"9bfe7d25-53ea-484d-a481-0ea04ee2b8a8","Type":"ContainerStarted","Data":"8d16199ef493b24538615a53998624f122687acb65de5bf87a5779df74d2e4de"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.412356 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.414290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" event={"ID":"3c9f18bd-def6-45ff-a92b-25c6f40d6bb5","Type":"ContainerStarted","Data":"451c6ed00601aa4e554b35fe9e8592269a0eeafa83748f0324e01ae4f1e7fdd3"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.415357 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.423683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.426452 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" event={"ID":"ec38d53e-6fe4-41b5-8548-e49fadd9d6bf","Type":"ContainerStarted","Data":"6ac8d07514471eaf2df48765e096536120a65a71579498a3b896067f295f9f69"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.426734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.431785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" event={"ID":"6eeb04ad-7251-488c-bd52-b2f14f6fb68b","Type":"ContainerStarted","Data":"9705459b49ed17e605e6a92451d276563775b14e8862b733af598edd781b6505"}
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.432020 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.439145 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h" podStartSLOduration=8.906206983 podStartE2EDuration="24.439109781s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.943598741 +0000 UTC m=+1005.920062365" lastFinishedPulling="2026-03-21 05:04:28.476501539 +0000 UTC m=+1021.452965163" observedRunningTime="2026-03-21 05:04:29.811391369 +0000 UTC m=+1022.787854993" watchObservedRunningTime="2026-03-21 05:04:35.439109781 +0000 UTC m=+1028.415573415"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.452461 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h" podStartSLOduration=3.138134134 podStartE2EDuration="24.452443349s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.351536662 +0000 UTC m=+1006.328000286" lastFinishedPulling="2026-03-21 05:04:34.665845877 +0000 UTC m=+1027.642309501" observedRunningTime="2026-03-21 05:04:35.448871448 +0000 UTC m=+1028.425335092" watchObservedRunningTime="2026-03-21 05:04:35.452443349 +0000 UTC m=+1028.428906993"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.453282 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jvglf" podStartSLOduration=3.054208466 podStartE2EDuration="24.453276763s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.351381317 +0000 UTC m=+1006.327844941" lastFinishedPulling="2026-03-21 05:04:34.750449594 +0000 UTC m=+1027.726913238" observedRunningTime="2026-03-21 05:04:35.43764128 +0000 UTC m=+1028.414104904" watchObservedRunningTime="2026-03-21 05:04:35.453276763 +0000 UTC m=+1028.429740397"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.490295 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5" podStartSLOduration=3.193918865 podStartE2EDuration="24.490276851s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.368200604 +0000 UTC m=+1006.344664228" lastFinishedPulling="2026-03-21 05:04:34.6645586 +0000 UTC m=+1027.641022214" observedRunningTime="2026-03-21 05:04:35.471803458 +0000 UTC m=+1028.448267102" watchObservedRunningTime="2026-03-21 05:04:35.490276851 +0000 UTC m=+1028.466740475"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.490867 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg" podStartSLOduration=3.185789635 podStartE2EDuration="24.490861738s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.35149286 +0000 UTC m=+1006.327956484" lastFinishedPulling="2026-03-21 05:04:34.656564963 +0000 UTC m=+1027.633028587" observedRunningTime="2026-03-21 05:04:35.488645905 +0000 UTC m=+1028.465109539" watchObservedRunningTime="2026-03-21 05:04:35.490861738 +0000 UTC m=+1028.467325362"
Mar 21 05:04:35 crc kubenswrapper[4775]: I0321 05:04:35.531054 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688" podStartSLOduration=3.226752787 podStartE2EDuration="24.531032797s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:13.351640975 +0000 UTC m=+1006.328104599" lastFinishedPulling="2026-03-21 05:04:34.655920985 +0000 UTC m=+1027.632384609" observedRunningTime="2026-03-21 05:04:35.525637474 +0000 UTC m=+1028.502101118" watchObservedRunningTime="2026-03-21 05:04:35.531032797 +0000 UTC m=+1028.507496421"
Mar 21 05:04:37 crc kubenswrapper[4775]: I0321 05:04:37.405760 4775 scope.go:117] "RemoveContainer" containerID="a6f778f857c6aa3ee7f297014df3be268b9de30de7297b84506cc38e86b9d649"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.342435 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-twhxx"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.343197 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dn22m"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.384168 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-lxvtw"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.404900 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lzvsq"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.445377 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-7bdbc"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.446403 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-m4gqz"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.549262 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-9g8pv"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.719001 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-jq88h"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.749364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-hf688"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.759807 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-4tbvg"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.801837 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-tss7r"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.826487 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-67scw"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.895954 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jj4pt"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.941282 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-ctm9h"
Mar 21 05:04:41 crc kubenswrapper[4775]: I0321 05:04:41.973973 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lqmgv"
Mar 21 05:04:42 crc kubenswrapper[4775]: I0321 05:04:42.084903 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-9s82h"
Mar 21 05:04:42 crc kubenswrapper[4775]: I0321 05:04:42.100665 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-l59gx"
Mar 21 05:04:42 crc kubenswrapper[4775]: I0321 05:04:42.208502 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-gdrc5"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.167940 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.182241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a66456f-7860-4dc1-9c1c-0db69ddcc800-cert\") pod \"infra-operator-controller-manager-65f65cc49c-2mgp8\" (UID: \"0a66456f-7860-4dc1-9c1c-0db69ddcc800\") " pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.275738 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fdr25"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.284877 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.476663 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.482537 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3035739a-202f-4794-bb4f-ae2342a96441-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-rwq59\" (UID: \"3035739a-202f-4794-bb4f-ae2342a96441\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.658141 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cthp5"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.665421 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"
Mar 21 05:04:43 crc kubenswrapper[4775]: W0321 05:04:43.700177 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a66456f_7860_4dc1_9c1c_0db69ddcc800.slice/crio-f3476740abd8ffe4ea69aa39b1d69aa40ae063bd15f785b821410788c847f2e1 WatchSource:0}: Error finding container f3476740abd8ffe4ea69aa39b1d69aa40ae063bd15f785b821410788c847f2e1: Status 404 returned error can't find the container with id f3476740abd8ffe4ea69aa39b1d69aa40ae063bd15f785b821410788c847f2e1
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.701586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"]
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.781515 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.781700 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.790924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-metrics-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.793860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/907f0cdf-2d87-4d09-97af-5591d061b4f6-webhook-certs\") pod \"openstack-operator-controller-manager-65746ff4dc-hg4rq\" (UID: \"907f0cdf-2d87-4d09-97af-5591d061b4f6\") " pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.891480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zqnnz"
Mar 21 05:04:43 crc kubenswrapper[4775]: I0321 05:04:43.899896 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:44 crc kubenswrapper[4775]: I0321 05:04:44.095839 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"]
Mar 21 05:04:44 crc kubenswrapper[4775]: I0321 05:04:44.126326 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"]
Mar 21 05:04:44 crc kubenswrapper[4775]: W0321 05:04:44.126381 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod907f0cdf_2d87_4d09_97af_5591d061b4f6.slice/crio-0b21472eea31b0e2d7f39cfe5e58d32c32feb754ddcc17473770ec2ea521949c WatchSource:0}: Error finding container 0b21472eea31b0e2d7f39cfe5e58d32c32feb754ddcc17473770ec2ea521949c: Status 404 returned error can't find the container with id 0b21472eea31b0e2d7f39cfe5e58d32c32feb754ddcc17473770ec2ea521949c
Mar 21 05:04:44 crc kubenswrapper[4775]: I0321 05:04:44.505497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" event={"ID":"907f0cdf-2d87-4d09-97af-5591d061b4f6","Type":"ContainerStarted","Data":"0b21472eea31b0e2d7f39cfe5e58d32c32feb754ddcc17473770ec2ea521949c"}
Mar 21 05:04:44 crc kubenswrapper[4775]: I0321 05:04:44.507568 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" event={"ID":"0a66456f-7860-4dc1-9c1c-0db69ddcc800","Type":"ContainerStarted","Data":"f3476740abd8ffe4ea69aa39b1d69aa40ae063bd15f785b821410788c847f2e1"}
Mar 21 05:04:44 crc kubenswrapper[4775]: I0321 05:04:44.508962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" event={"ID":"3035739a-202f-4794-bb4f-ae2342a96441","Type":"ContainerStarted","Data":"aaf4afecb3a24559f5e03ae7f8304d8221883557cd03befa4b7f56c9f9157e02"}
Mar 21 05:04:49 crc kubenswrapper[4775]: I0321 05:04:49.547106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" event={"ID":"907f0cdf-2d87-4d09-97af-5591d061b4f6","Type":"ContainerStarted","Data":"1a4bea49d6fe2de12f4e42fa90462b35cc170947f406707e3b3cac159711d6cd"}
Mar 21 05:04:50 crc kubenswrapper[4775]: I0321 05:04:50.552333 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:04:50 crc kubenswrapper[4775]: I0321 05:04:50.585315 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq" podStartSLOduration=39.585295485 podStartE2EDuration="39.585295485s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:04:50.582765003 +0000 UTC m=+1043.559228667" watchObservedRunningTime="2026-03-21 05:04:50.585295485 +0000 UTC m=+1043.561759109"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.573808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" event={"ID":"0a66456f-7860-4dc1-9c1c-0db69ddcc800","Type":"ContainerStarted","Data":"2288ecffa8b7fc13461f7bb72cc6b3b81443765bc5a9cb5baf4b63c40a61978f"}
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.575421 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.577340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" event={"ID":"3035739a-202f-4794-bb4f-ae2342a96441","Type":"ContainerStarted","Data":"6ddf9d8507a99c5f6069ed20292ddc2ddd0d77f55a16a4db939eb0792bc68880"}
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.578055 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.580869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" event={"ID":"49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd","Type":"ContainerStarted","Data":"8fad59adcb09e434567ccb04bde888708623627b90fef5c5cb347ba318a1911d"}
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.581205 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.617665 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw" podStartSLOduration=2.89936375 podStartE2EDuration="42.617649031s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:12.92805004 +0000 UTC m=+1005.904513664" lastFinishedPulling="2026-03-21 05:04:52.646335321 +0000 UTC m=+1045.622798945" observedRunningTime="2026-03-21 05:04:53.615331945 +0000 UTC m=+1046.591795569" watchObservedRunningTime="2026-03-21 05:04:53.617649031 +0000 UTC m=+1046.594112655"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.620024 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8" podStartSLOduration=33.642300307 podStartE2EDuration="42.620016948s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:43.704537096 +0000 UTC m=+1036.681000760" lastFinishedPulling="2026-03-21 05:04:52.682253777 +0000 UTC m=+1045.658717401" observedRunningTime="2026-03-21 05:04:53.591835891 +0000 UTC m=+1046.568299525" watchObservedRunningTime="2026-03-21 05:04:53.620016948 +0000 UTC m=+1046.596480582"
Mar 21 05:04:53 crc kubenswrapper[4775]: I0321 05:04:53.641998 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59" podStartSLOduration=34.068200405 podStartE2EDuration="42.641978849s" podCreationTimestamp="2026-03-21 05:04:11 +0000 UTC" firstStartedPulling="2026-03-21 05:04:44.10766935 +0000 UTC m=+1037.084132974" lastFinishedPulling="2026-03-21 05:04:52.681447794 +0000 UTC m=+1045.657911418" observedRunningTime="2026-03-21 05:04:53.637761299 +0000 UTC m=+1046.614224923" watchObservedRunningTime="2026-03-21 05:04:53.641978849 +0000 UTC m=+1046.618442473"
Mar 21 05:05:01 crc kubenswrapper[4775]: I0321 05:05:01.579653 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-c7pjw"
Mar 21 05:05:03 crc kubenswrapper[4775]: I0321 05:05:03.293760 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-65f65cc49c-2mgp8"
Mar 21 05:05:03 crc kubenswrapper[4775]: I0321 05:05:03.679187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-rwq59"
Mar 21 05:05:03 crc kubenswrapper[4775]: I0321 05:05:03.912437 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65746ff4dc-hg4rq"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.774604 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"]
Mar 21 05:05:19 crc kubenswrapper[4775]: E0321 05:05:19.776376 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="extract-utilities"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.776398 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="extract-utilities"
Mar 21 05:05:19 crc kubenswrapper[4775]: E0321 05:05:19.776414 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="registry-server"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.776422 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="registry-server"
Mar 21 05:05:19 crc kubenswrapper[4775]: E0321 05:05:19.776440 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="extract-content"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.776448 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="extract-content"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.776657 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="694f1c66-ce56-4854-9f4c-02a42e16046d" containerName="registry-server"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.777564 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.779411 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.779623 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sndkl"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.780548 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.781137 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.792376 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"]
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.827330 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"]
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.828610 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.835471 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.843476 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"]
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.968973 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.969328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.969431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74n7g\" (UniqueName: \"kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.969805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:19 crc kubenswrapper[4775]: I0321 05:05:19.970028 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlpch\" (UniqueName: \"kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.072084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.072464 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.072487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74n7g\" (UniqueName: \"kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.072852 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.073080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlpch\" (UniqueName: \"kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.073423 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.073740 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.074427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.095203 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlpch\" (UniqueName: \"kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch\") pod \"dnsmasq-dns-675f4bcbfc-d942p\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.101886 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74n7g\" (UniqueName: \"kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g\") pod \"dnsmasq-dns-78dd6ddcc-bqfch\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.103983 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.156108 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch"
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.595456 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"]
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.634957 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"]
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.762474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch" event={"ID":"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d","Type":"ContainerStarted","Data":"59274326bf041a9ff442e3ad650874c430fb6d402b02a40d2d8ac6b554c247ca"}
Mar 21 05:05:20 crc kubenswrapper[4775]: I0321 05:05:20.763825 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p" event={"ID":"62c4e9af-66f9-4f32-8c9a-30d4907a753c","Type":"ContainerStarted","Data":"a875cb7a582da7fe33690731a6ec7fd7d012b80f676cca55452c73ba7b74cfd0"}
Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.722765 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"]
Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.741365 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"]
Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.742461 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.763283 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"] Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.817808 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.817890 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.817925 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zth55\" (UniqueName: \"kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.918751 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zth55\" (UniqueName: \"kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.918852 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.918879 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.919732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.919748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:22 crc kubenswrapper[4775]: I0321 05:05:22.945256 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zth55\" (UniqueName: \"kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55\") pod \"dnsmasq-dns-5ccc8479f9-4rt6c\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.004191 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"] Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.026205 4775 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"] Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.027332 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.032966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"] Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.085383 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.226498 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77fw\" (UniqueName: \"kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.226828 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.226887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.328437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f77fw\" (UniqueName: 
\"kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.328570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.328611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.330473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.330634 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.367988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77fw\" (UniqueName: \"kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw\") pod \"dnsmasq-dns-57d769cc4f-mn5xb\" 
(UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") " pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.609037 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"] Mar 21 05:05:23 crc kubenswrapper[4775]: W0321 05:05:23.621770 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b9920a8_df59_4747_8f72_1f25d22517b3.slice/crio-70d41f601713d43b2beb92b797cb5b68ffe82649d11afaeaa30ae3ca5021d995 WatchSource:0}: Error finding container 70d41f601713d43b2beb92b797cb5b68ffe82649d11afaeaa30ae3ca5021d995: Status 404 returned error can't find the container with id 70d41f601713d43b2beb92b797cb5b68ffe82649d11afaeaa30ae3ca5021d995 Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.642838 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.808180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" event={"ID":"6b9920a8-df59-4747-8f72-1f25d22517b3","Type":"ContainerStarted","Data":"70d41f601713d43b2beb92b797cb5b68ffe82649d11afaeaa30ae3ca5021d995"} Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.874052 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.877568 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.881648 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.881807 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.881838 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.881968 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.881980 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5pn7h" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.882263 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.886214 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 21 05:05:23 crc kubenswrapper[4775]: I0321 05:05:23.897551 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.037751 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038108 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038230 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038349 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdt6\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.038557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.139892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.139944 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.139987 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140008 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140068 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140083 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdt6\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140152 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140175 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.140541 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.141268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.141403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.141696 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.143319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.143948 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.153687 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.153753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.154195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.163772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.168234 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"] Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 
05:05:24.170356 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdt6\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.175081 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.178883 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.181391 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.181813 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.181961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kmjnx" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.182253 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.182610 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.184528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.184755 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.194485 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.207341 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343542 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343576 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtzj\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343613 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343682 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343770 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343856 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343904 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.343996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtzj\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445652 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445768 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445808 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.445841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.446168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.446676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.446805 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.446874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.447411 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.448023 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.451311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.452704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.457882 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc 
kubenswrapper[4775]: I0321 05:05:24.464012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.464467 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtzj\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.479095 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.494925 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.555951 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.822937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" event={"ID":"2084b413-8c86-4cc0-89ce-0dbfc4049e9b","Type":"ContainerStarted","Data":"04459e09511e8f218103b62e1cd093e2c1d10ce88498abbdd40b243c9345ef58"} Mar 21 05:05:24 crc kubenswrapper[4775]: I0321 05:05:24.958519 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.033316 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.159876 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.160984 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.163987 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.164176 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hwff4" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.164187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.171621 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.174205 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.182827 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276245 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-kolla-config\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-default\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc 
kubenswrapper[4775]: I0321 05:05:25.276452 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.276525 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wcnd\" (UniqueName: \"kubernetes.io/projected/76d205b7-bc2e-4dad-b513-457ff20d67e1-kube-api-access-8wcnd\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379036 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379159 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wcnd\" (UniqueName: \"kubernetes.io/projected/76d205b7-bc2e-4dad-b513-457ff20d67e1-kube-api-access-8wcnd\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379271 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-kolla-config\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379320 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379398 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-default\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.379524 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.381133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.381183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-kolla-config\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.381435 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.381952 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-default\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.382201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/76d205b7-bc2e-4dad-b513-457ff20d67e1-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.390492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.394436 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d205b7-bc2e-4dad-b513-457ff20d67e1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.399997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wcnd\" (UniqueName: \"kubernetes.io/projected/76d205b7-bc2e-4dad-b513-457ff20d67e1-kube-api-access-8wcnd\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.454788 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"76d205b7-bc2e-4dad-b513-457ff20d67e1\") " pod="openstack/openstack-galera-0" Mar 21 05:05:25 crc kubenswrapper[4775]: I0321 05:05:25.486516 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.488436 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.490248 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.492021 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-sr8tt" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.492711 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.493318 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.500543 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.513336 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595420 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595468 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595495 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595533 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595550 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bade9789-f227-44ab-b7fa-2173445cd381-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595583 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595622 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pqb\" (UniqueName: \"kubernetes.io/projected/bade9789-f227-44ab-b7fa-2173445cd381-kube-api-access-88pqb\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.595660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.699968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pqb\" (UniqueName: \"kubernetes.io/projected/bade9789-f227-44ab-b7fa-2173445cd381-kube-api-access-88pqb\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700023 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700063 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bade9789-f227-44ab-b7fa-2173445cd381-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700219 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.700896 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bade9789-f227-44ab-b7fa-2173445cd381-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.701015 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") device mount 
path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.701256 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.701775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.702082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bade9789-f227-44ab-b7fa-2173445cd381-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.714312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.714468 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bade9789-f227-44ab-b7fa-2173445cd381-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 
05:05:26.717953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pqb\" (UniqueName: \"kubernetes.io/projected/bade9789-f227-44ab-b7fa-2173445cd381-kube-api-access-88pqb\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.735186 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bade9789-f227-44ab-b7fa-2173445cd381\") " pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.810794 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.883749 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.884992 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.888567 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.888843 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wnhbq" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.889001 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 21 05:05:26 crc kubenswrapper[4775]: I0321 05:05:26.895148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.005614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-kolla-config\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.005688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pvwr\" (UniqueName: \"kubernetes.io/projected/2a8305a2-5178-437f-a896-314b34fa595e-kube-api-access-6pvwr\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.005769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.005930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.005962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-config-data\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.107234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-kolla-config\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.107279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pvwr\" (UniqueName: \"kubernetes.io/projected/2a8305a2-5178-437f-a896-314b34fa595e-kube-api-access-6pvwr\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.107307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.107325 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.107353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-config-data\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.108211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-kolla-config\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.108326 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2a8305a2-5178-437f-a896-314b34fa595e-config-data\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.110360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.111628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8305a2-5178-437f-a896-314b34fa595e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.130225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pvwr\" (UniqueName: 
\"kubernetes.io/projected/2a8305a2-5178-437f-a896-314b34fa595e-kube-api-access-6pvwr\") pod \"memcached-0\" (UID: \"2a8305a2-5178-437f-a896-314b34fa595e\") " pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.214985 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 21 05:05:27 crc kubenswrapper[4775]: W0321 05:05:27.440932 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375fb8b7_b673_4fd7_ae51_5f82f33c196f.slice/crio-56ec7d33ff14d09de4f061f404a6116851871ee2ca592e6f5daea8ccedd576e8 WatchSource:0}: Error finding container 56ec7d33ff14d09de4f061f404a6116851871ee2ca592e6f5daea8ccedd576e8: Status 404 returned error can't find the container with id 56ec7d33ff14d09de4f061f404a6116851871ee2ca592e6f5daea8ccedd576e8 Mar 21 05:05:27 crc kubenswrapper[4775]: W0321 05:05:27.443324 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839e915e_8197_48e9_8b69_56ac420a1eed.slice/crio-ef63fbfadda52109f2b3ecef0f45ed06c3a90225a2c4d5559784277a4c40c9dd WatchSource:0}: Error finding container ef63fbfadda52109f2b3ecef0f45ed06c3a90225a2c4d5559784277a4c40c9dd: Status 404 returned error can't find the container with id ef63fbfadda52109f2b3ecef0f45ed06c3a90225a2c4d5559784277a4c40c9dd Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.852311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerStarted","Data":"ef63fbfadda52109f2b3ecef0f45ed06c3a90225a2c4d5559784277a4c40c9dd"} Mar 21 05:05:27 crc kubenswrapper[4775]: I0321 05:05:27.853424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerStarted","Data":"56ec7d33ff14d09de4f061f404a6116851871ee2ca592e6f5daea8ccedd576e8"} Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.255312 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.256435 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.258278 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h2svw" Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.265739 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.356944 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zwf\" (UniqueName: \"kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf\") pod \"kube-state-metrics-0\" (UID: \"31578b15-f84b-4862-ae52-6720dac8f5e2\") " pod="openstack/kube-state-metrics-0" Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.458149 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zwf\" (UniqueName: \"kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf\") pod \"kube-state-metrics-0\" (UID: \"31578b15-f84b-4862-ae52-6720dac8f5e2\") " pod="openstack/kube-state-metrics-0" Mar 21 05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.490764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zwf\" (UniqueName: \"kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf\") pod \"kube-state-metrics-0\" (UID: \"31578b15-f84b-4862-ae52-6720dac8f5e2\") " pod="openstack/kube-state-metrics-0" Mar 21 
05:05:29 crc kubenswrapper[4775]: I0321 05:05:29.577553 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.475889 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nmtjx"] Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.477152 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.479188 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.481481 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.494455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9cknj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.495729 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nmtjx"] Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.544948 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-frhpj"] Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.546417 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.561423 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-frhpj"] Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.622638 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a8e948c-2978-40c8-961b-1b010f7ea920-scripts\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.622717 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.622922 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-ovn-controller-tls-certs\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.622991 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-log-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.623021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqkcl\" (UniqueName: 
\"kubernetes.io/projected/8a8e948c-2978-40c8-961b-1b010f7ea920-kube-api-access-wqkcl\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.623145 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.623247 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-combined-ca-bundle\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.724901 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-combined-ca-bundle\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.724964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-log\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725008 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a8e948c-2978-40c8-961b-1b010f7ea920-scripts\") pod 
\"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725054 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a367f7-4951-4e7c-889f-d147676654f8-scripts\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725163 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-ovn-controller-tls-certs\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725190 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28vs\" (UniqueName: \"kubernetes.io/projected/88a367f7-4951-4e7c-889f-d147676654f8-kube-api-access-m28vs\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-log-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " 
pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725293 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqkcl\" (UniqueName: \"kubernetes.io/projected/8a8e948c-2978-40c8-961b-1b010f7ea920-kube-api-access-wqkcl\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-run\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-etc-ovs\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725718 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-lib\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-log-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 
05:05:32.725792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.725844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.726020 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a8e948c-2978-40c8-961b-1b010f7ea920-var-run-ovn\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.728397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a8e948c-2978-40c8-961b-1b010f7ea920-scripts\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.731657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-ovn-controller-tls-certs\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.740620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a8e948c-2978-40c8-961b-1b010f7ea920-combined-ca-bundle\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.751463 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqkcl\" (UniqueName: \"kubernetes.io/projected/8a8e948c-2978-40c8-961b-1b010f7ea920-kube-api-access-wqkcl\") pod \"ovn-controller-nmtjx\" (UID: \"8a8e948c-2978-40c8-961b-1b010f7ea920\") " pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.797368 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827288 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m28vs\" (UniqueName: \"kubernetes.io/projected/88a367f7-4951-4e7c-889f-d147676654f8-kube-api-access-m28vs\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827360 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-run\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-etc-ovs\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827412 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-lib\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827471 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-log\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.827574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a367f7-4951-4e7c-889f-d147676654f8-scripts\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.828295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-log\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.828821 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-run\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.828949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-etc-ovs\") pod \"ovn-controller-ovs-frhpj\" (UID: 
\"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.829058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88a367f7-4951-4e7c-889f-d147676654f8-var-lib\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.830774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a367f7-4951-4e7c-889f-d147676654f8-scripts\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.846411 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28vs\" (UniqueName: \"kubernetes.io/projected/88a367f7-4951-4e7c-889f-d147676654f8-kube-api-access-m28vs\") pod \"ovn-controller-ovs-frhpj\" (UID: \"88a367f7-4951-4e7c-889f-d147676654f8\") " pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:32 crc kubenswrapper[4775]: I0321 05:05:32.860787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.330989 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.334573 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.337901 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.338013 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.338156 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-947qk" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.338204 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.341285 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.349024 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437729 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-config\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437787 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437859 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.437916 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.438002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.438039 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdms\" (UniqueName: 
\"kubernetes.io/projected/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-kube-api-access-pxdms\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540751 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-kube-api-access-pxdms\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-config\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540914 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540966 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.540987 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.541011 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.542057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.542354 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.542452 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-config\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.543164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.544789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.547047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.547610 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.557763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdms\" (UniqueName: \"kubernetes.io/projected/b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e-kube-api-access-pxdms\") pod \"ovsdbserver-nb-0\" (UID: 
\"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.562874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e\") " pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:33 crc kubenswrapper[4775]: I0321 05:05:33.667485 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.024929 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.027273 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.030963 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.031266 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.038182 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.041580 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-n5ltt" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.089593 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/93886182-fca2-42a9-a134-2243c7c7073d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-config\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192902 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.192984 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-scripts\") pod 
\"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.193074 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf5qj\" (UniqueName: \"kubernetes.io/projected/93886182-fca2-42a9-a134-2243c7c7073d-kube-api-access-gf5qj\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.193195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294197 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-config\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc 
kubenswrapper[4775]: I0321 05:05:36.294244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294262 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294293 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf5qj\" (UniqueName: \"kubernetes.io/projected/93886182-fca2-42a9-a134-2243c7c7073d-kube-api-access-gf5qj\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294320 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/93886182-fca2-42a9-a134-2243c7c7073d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.294748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/93886182-fca2-42a9-a134-2243c7c7073d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.295901 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.296740 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.297829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93886182-fca2-42a9-a134-2243c7c7073d-config\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.302858 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.303284 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 
05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.303465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/93886182-fca2-42a9-a134-2243c7c7073d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.314425 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf5qj\" (UniqueName: \"kubernetes.io/projected/93886182-fca2-42a9-a134-2243c7c7073d-kube-api-access-gf5qj\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.333784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"93886182-fca2-42a9-a134-2243c7c7073d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:36 crc kubenswrapper[4775]: I0321 05:05:36.344066 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:37 crc kubenswrapper[4775]: I0321 05:05:37.224157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 05:05:37 crc kubenswrapper[4775]: E0321 05:05:37.615236 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:05:37 crc kubenswrapper[4775]: E0321 05:05:37.615687 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlpch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Secur
ityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-d942p_openstack(62c4e9af-66f9-4f32-8c9a-30d4907a753c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:05:37 crc kubenswrapper[4775]: E0321 05:05:37.616900 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p" podUID="62c4e9af-66f9-4f32-8c9a-30d4907a753c" Mar 21 05:05:38 crc kubenswrapper[4775]: W0321 05:05:38.626807 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8305a2_5178_437f_a896_314b34fa595e.slice/crio-598009fbb019244e3b53d66fa63a72d4717d256096a6e5954a74b77181d01dc1 WatchSource:0}: Error finding container 598009fbb019244e3b53d66fa63a72d4717d256096a6e5954a74b77181d01dc1: Status 404 returned error can't find the container with id 598009fbb019244e3b53d66fa63a72d4717d256096a6e5954a74b77181d01dc1 Mar 21 05:05:38 crc kubenswrapper[4775]: E0321 05:05:38.660657 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 05:05:38 crc kubenswrapper[4775]: E0321 05:05:38.660833 4775 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-74n7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bqfch_openstack(e8c11166-8b08-43d6-84e3-e5cbd0adbd8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:05:38 crc kubenswrapper[4775]: E0321 05:05:38.662006 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch" podUID="e8c11166-8b08-43d6-84e3-e5cbd0adbd8d" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.746871 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.839252 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlpch\" (UniqueName: \"kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch\") pod \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.840163 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config\") pod \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\" (UID: \"62c4e9af-66f9-4f32-8c9a-30d4907a753c\") " Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.844024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config" (OuterVolumeSpecName: "config") pod "62c4e9af-66f9-4f32-8c9a-30d4907a753c" (UID: "62c4e9af-66f9-4f32-8c9a-30d4907a753c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.844268 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch" (OuterVolumeSpecName: "kube-api-access-vlpch") pod "62c4e9af-66f9-4f32-8c9a-30d4907a753c" (UID: "62c4e9af-66f9-4f32-8c9a-30d4907a753c"). InnerVolumeSpecName "kube-api-access-vlpch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.942736 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlpch\" (UniqueName: \"kubernetes.io/projected/62c4e9af-66f9-4f32-8c9a-30d4907a753c-kube-api-access-vlpch\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.943074 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c4e9af-66f9-4f32-8c9a-30d4907a753c-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.980420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p" event={"ID":"62c4e9af-66f9-4f32-8c9a-30d4907a753c","Type":"ContainerDied","Data":"a875cb7a582da7fe33690731a6ec7fd7d012b80f676cca55452c73ba7b74cfd0"} Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.980444 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-d942p" Mar 21 05:05:38 crc kubenswrapper[4775]: I0321 05:05:38.982202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2a8305a2-5178-437f-a896-314b34fa595e","Type":"ContainerStarted","Data":"598009fbb019244e3b53d66fa63a72d4717d256096a6e5954a74b77181d01dc1"} Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.073702 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"] Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.082199 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-d942p"] Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.088245 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 05:05:39 crc kubenswrapper[4775]: W0321 05:05:39.088746 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbade9789_f227_44ab_b7fa_2173445cd381.slice/crio-71a4bfd304371129a48c8ee069c1f253612b58ce94a9b582afab27fc7cfffa37 WatchSource:0}: Error finding container 71a4bfd304371129a48c8ee069c1f253612b58ce94a9b582afab27fc7cfffa37: Status 404 returned error can't find the container with id 71a4bfd304371129a48c8ee069c1f253612b58ce94a9b582afab27fc7cfffa37 Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.431445 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:05:39 crc kubenswrapper[4775]: W0321 05:05:39.440439 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31578b15_f84b_4862_ae52_6720dac8f5e2.slice/crio-95eba21f713625c8d0e5f46fb846b1edb0522a4501294e1679ea681e44f67781 WatchSource:0}: Error finding container 95eba21f713625c8d0e5f46fb846b1edb0522a4501294e1679ea681e44f67781: Status 404 returned error 
can't find the container with id 95eba21f713625c8d0e5f46fb846b1edb0522a4501294e1679ea681e44f67781 Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.445174 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nmtjx"] Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.453767 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.491743 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.523728 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-frhpj"] Mar 21 05:05:39 crc kubenswrapper[4775]: W0321 05:05:39.524434 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88a367f7_4951_4e7c_889f_d147676654f8.slice/crio-c5fb9d3ae3c518dfedbe697bf5cdfbf1d08a6cfe78edab781073070def753b30 WatchSource:0}: Error finding container c5fb9d3ae3c518dfedbe697bf5cdfbf1d08a6cfe78edab781073070def753b30: Status 404 returned error can't find the container with id c5fb9d3ae3c518dfedbe697bf5cdfbf1d08a6cfe78edab781073070def753b30 Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.657301 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config\") pod \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.657462 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc\") pod \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " Mar 21 05:05:39 crc 
kubenswrapper[4775]: I0321 05:05:39.657569 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74n7g\" (UniqueName: \"kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g\") pod \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\" (UID: \"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d\") " Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.658023 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d" (UID: "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.658064 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config" (OuterVolumeSpecName: "config") pod "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d" (UID: "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.670902 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c4e9af-66f9-4f32-8c9a-30d4907a753c" path="/var/lib/kubelet/pods/62c4e9af-66f9-4f32-8c9a-30d4907a753c/volumes" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.759880 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.759918 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.796746 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g" (OuterVolumeSpecName: "kube-api-access-74n7g") pod "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d" (UID: "e8c11166-8b08-43d6-84e3-e5cbd0adbd8d"). InnerVolumeSpecName "kube-api-access-74n7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.861620 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74n7g\" (UniqueName: \"kubernetes.io/projected/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d-kube-api-access-74n7g\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.990178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerStarted","Data":"5a7046cba7a8f5bae4cf5ebd270104b81a62d747ce978d3680ad2d2e0a16d243"} Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.992087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"76d205b7-bc2e-4dad-b513-457ff20d67e1","Type":"ContainerStarted","Data":"0ee342247a2ef74219a0dfefa06608353417f7dec8acfc9478104d9cbd2b4c18"} Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.993905 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerStarted","Data":"1c4292485e2c0ee4f0c83f962fffc48a32f990f8105220ac94e44a4691b7ff1f"} Mar 21 05:05:39 crc kubenswrapper[4775]: I0321 05:05:39.994774 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nmtjx" event={"ID":"8a8e948c-2978-40c8-961b-1b010f7ea920","Type":"ContainerStarted","Data":"1c5e5cd13d3918f6a03a83aeea0df7177a08b6000e20544475fcf11597f71a2d"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:39.998452 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch" event={"ID":"e8c11166-8b08-43d6-84e3-e5cbd0adbd8d","Type":"ContainerDied","Data":"59274326bf041a9ff442e3ad650874c430fb6d402b02a40d2d8ac6b554c247ca"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:39.998515 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bqfch" Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:39.999659 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-frhpj" event={"ID":"88a367f7-4951-4e7c-889f-d147676654f8","Type":"ContainerStarted","Data":"c5fb9d3ae3c518dfedbe697bf5cdfbf1d08a6cfe78edab781073070def753b30"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.000532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bade9789-f227-44ab-b7fa-2173445cd381","Type":"ContainerStarted","Data":"71a4bfd304371129a48c8ee069c1f253612b58ce94a9b582afab27fc7cfffa37"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.002085 4775 generic.go:334] "Generic (PLEG): container finished" podID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerID="f749dd6bac424eb0695958e8b0d01fc15e3e824b72ea62322c8b3aee65b4d200" exitCode=0 Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.002176 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" event={"ID":"2084b413-8c86-4cc0-89ce-0dbfc4049e9b","Type":"ContainerDied","Data":"f749dd6bac424eb0695958e8b0d01fc15e3e824b72ea62322c8b3aee65b4d200"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.003102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"31578b15-f84b-4862-ae52-6720dac8f5e2","Type":"ContainerStarted","Data":"95eba21f713625c8d0e5f46fb846b1edb0522a4501294e1679ea681e44f67781"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.005462 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerID="e34c6feccb4c3623eb4d46c49fed2bc999053cbdc47d3e661349de5746c1ab7d" exitCode=0 Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.005492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" 
event={"ID":"6b9920a8-df59-4747-8f72-1f25d22517b3","Type":"ContainerDied","Data":"e34c6feccb4c3623eb4d46c49fed2bc999053cbdc47d3e661349de5746c1ab7d"} Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.114193 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"] Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.120182 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bqfch"] Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.256094 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 05:05:40 crc kubenswrapper[4775]: I0321 05:05:40.376518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 05:05:40 crc kubenswrapper[4775]: W0321 05:05:40.720797 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93886182_fca2_42a9_a134_2243c7c7073d.slice/crio-400536f01cd75d0d61183bbdf297be1b31f3ce3d729c063daac7126e63ada8d3 WatchSource:0}: Error finding container 400536f01cd75d0d61183bbdf297be1b31f3ce3d729c063daac7126e63ada8d3: Status 404 returned error can't find the container with id 400536f01cd75d0d61183bbdf297be1b31f3ce3d729c063daac7126e63ada8d3 Mar 21 05:05:40 crc kubenswrapper[4775]: W0321 05:05:40.723404 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4d6ed0c_fd91_4ed6_9a1c_0a31f78ed48e.slice/crio-de83a4eb6c823a9c0e42ab2a3e22cd64db36c616bd8f0506ecbc4b6065e74379 WatchSource:0}: Error finding container de83a4eb6c823a9c0e42ab2a3e22cd64db36c616bd8f0506ecbc4b6065e74379: Status 404 returned error can't find the container with id de83a4eb6c823a9c0e42ab2a3e22cd64db36c616bd8f0506ecbc4b6065e74379 Mar 21 05:05:41 crc kubenswrapper[4775]: I0321 05:05:41.016631 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e","Type":"ContainerStarted","Data":"de83a4eb6c823a9c0e42ab2a3e22cd64db36c616bd8f0506ecbc4b6065e74379"} Mar 21 05:05:41 crc kubenswrapper[4775]: I0321 05:05:41.018944 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"93886182-fca2-42a9-a134-2243c7c7073d","Type":"ContainerStarted","Data":"400536f01cd75d0d61183bbdf297be1b31f3ce3d729c063daac7126e63ada8d3"} Mar 21 05:05:41 crc kubenswrapper[4775]: I0321 05:05:41.674170 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c11166-8b08-43d6-84e3-e5cbd0adbd8d" path="/var/lib/kubelet/pods/e8c11166-8b08-43d6-84e3-e5cbd0adbd8d/volumes" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.031143 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" event={"ID":"6b9920a8-df59-4747-8f72-1f25d22517b3","Type":"ContainerStarted","Data":"c5ba9d79e2eb52707c6820af305f99b1e2b4041e9f4aebd9fe6f0da5d03ca612"} Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.031525 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.034838 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" event={"ID":"2084b413-8c86-4cc0-89ce-0dbfc4049e9b","Type":"ContainerStarted","Data":"9a3778eec58b097b77fc9e80249e34ac71c57ec91481179476b24785158c24e3"} Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.034990 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.038217 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"2a8305a2-5178-437f-a896-314b34fa595e","Type":"ContainerStarted","Data":"d148febd920b4396b058dabe656656d420318d45ca05c3371173ea771fb88de7"} Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.038316 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.054086 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" podStartSLOduration=4.894380628 podStartE2EDuration="20.054063448s" podCreationTimestamp="2026-03-21 05:05:22 +0000 UTC" firstStartedPulling="2026-03-21 05:05:23.62537611 +0000 UTC m=+1076.601839734" lastFinishedPulling="2026-03-21 05:05:38.78505893 +0000 UTC m=+1091.761522554" observedRunningTime="2026-03-21 05:05:42.048885632 +0000 UTC m=+1095.025349266" watchObservedRunningTime="2026-03-21 05:05:42.054063448 +0000 UTC m=+1095.030527072" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.072012 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" podStartSLOduration=5.469301977 podStartE2EDuration="20.071992365s" podCreationTimestamp="2026-03-21 05:05:22 +0000 UTC" firstStartedPulling="2026-03-21 05:05:24.176916498 +0000 UTC m=+1077.153380122" lastFinishedPulling="2026-03-21 05:05:38.779606886 +0000 UTC m=+1091.756070510" observedRunningTime="2026-03-21 05:05:42.065038489 +0000 UTC m=+1095.041502113" watchObservedRunningTime="2026-03-21 05:05:42.071992365 +0000 UTC m=+1095.048455989" Mar 21 05:05:42 crc kubenswrapper[4775]: I0321 05:05:42.087788 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.930443911 podStartE2EDuration="16.087769652s" podCreationTimestamp="2026-03-21 05:05:26 +0000 UTC" firstStartedPulling="2026-03-21 05:05:38.645713179 +0000 UTC m=+1091.622176803" lastFinishedPulling="2026-03-21 05:05:40.80303892 +0000 UTC 
m=+1093.779502544" observedRunningTime="2026-03-21 05:05:42.084649913 +0000 UTC m=+1095.061113557" watchObservedRunningTime="2026-03-21 05:05:42.087769652 +0000 UTC m=+1095.064233276" Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.097249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"31578b15-f84b-4862-ae52-6720dac8f5e2","Type":"ContainerStarted","Data":"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.098017 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.099922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nmtjx" event={"ID":"8a8e948c-2978-40c8-961b-1b010f7ea920","Type":"ContainerStarted","Data":"5903ae6dcf222706531c88d4b76781af7c8d83e54858da54fffe258cfd24a8d1"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.100037 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nmtjx" Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.101408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e","Type":"ContainerStarted","Data":"1b47eab34a6ed1ec2f0f84bb8a8895662f33297606bb0ccea9e7478bc25144cd"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.102955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-frhpj" event={"ID":"88a367f7-4951-4e7c-889f-d147676654f8","Type":"ContainerStarted","Data":"c0c441602e7099da4eb86364cb4742ce7c9df7ca08a5b6193a116451d7ebf21f"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.104362 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"bade9789-f227-44ab-b7fa-2173445cd381","Type":"ContainerStarted","Data":"48c166a356d78aa2e47a8f4d74ca291a5cf4a906e0b5939c25cc1658ceae0a52"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.106071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"93886182-fca2-42a9-a134-2243c7c7073d","Type":"ContainerStarted","Data":"07ba9f15899b01d878ebea12f580563712b0fb002c676c1b85c52d6bf53361c3"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.108717 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"76d205b7-bc2e-4dad-b513-457ff20d67e1","Type":"ContainerStarted","Data":"96c9a92b11a655f0110f2a971188cfd82d21616cb9cd4cd24d771176f3603b12"} Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.140978 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.074439585 podStartE2EDuration="18.140956757s" podCreationTimestamp="2026-03-21 05:05:29 +0000 UTC" firstStartedPulling="2026-03-21 05:05:39.443478401 +0000 UTC m=+1092.419942025" lastFinishedPulling="2026-03-21 05:05:46.509995573 +0000 UTC m=+1099.486459197" observedRunningTime="2026-03-21 05:05:47.115424825 +0000 UTC m=+1100.091888449" watchObservedRunningTime="2026-03-21 05:05:47.140956757 +0000 UTC m=+1100.117420381" Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.143649 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nmtjx" podStartSLOduration=8.489568095 podStartE2EDuration="15.143630943s" podCreationTimestamp="2026-03-21 05:05:32 +0000 UTC" firstStartedPulling="2026-03-21 05:05:39.463705923 +0000 UTC m=+1092.440169547" lastFinishedPulling="2026-03-21 05:05:46.117768771 +0000 UTC m=+1099.094232395" observedRunningTime="2026-03-21 05:05:47.134642299 +0000 UTC m=+1100.111105923" watchObservedRunningTime="2026-03-21 05:05:47.143630943 +0000 UTC m=+1100.120094567" 
Mar 21 05:05:47 crc kubenswrapper[4775]: I0321 05:05:47.216074 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.086320 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.128719 4775 generic.go:334] "Generic (PLEG): container finished" podID="88a367f7-4951-4e7c-889f-d147676654f8" containerID="c0c441602e7099da4eb86364cb4742ce7c9df7ca08a5b6193a116451d7ebf21f" exitCode=0 Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.130079 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-frhpj" event={"ID":"88a367f7-4951-4e7c-889f-d147676654f8","Type":"ContainerDied","Data":"c0c441602e7099da4eb86364cb4742ce7c9df7ca08a5b6193a116451d7ebf21f"} Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.645073 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.695513 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"] Mar 21 05:05:48 crc kubenswrapper[4775]: I0321 05:05:48.695760 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="dnsmasq-dns" containerID="cri-o://c5ba9d79e2eb52707c6820af305f99b1e2b4041e9f4aebd9fe6f0da5d03ca612" gracePeriod=10 Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.151384 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerID="c5ba9d79e2eb52707c6820af305f99b1e2b4041e9f4aebd9fe6f0da5d03ca612" exitCode=0 Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.151465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" event={"ID":"6b9920a8-df59-4747-8f72-1f25d22517b3","Type":"ContainerDied","Data":"c5ba9d79e2eb52707c6820af305f99b1e2b4041e9f4aebd9fe6f0da5d03ca612"} Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.154679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-frhpj" event={"ID":"88a367f7-4951-4e7c-889f-d147676654f8","Type":"ContainerStarted","Data":"3fb00cdb5714dbc33207822ccac44aa4fc4bbc07c11261280f69e6610060af07"} Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.697615 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"] Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.700653 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.717088 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"] Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.840313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zcdx\" (UniqueName: \"kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.840536 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.840595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.941991 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zcdx\" (UniqueName: \"kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.942163 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.943211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.943286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.946038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config\") pod 
\"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:49 crc kubenswrapper[4775]: I0321 05:05:49.995389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zcdx\" (UniqueName: \"kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx\") pod \"dnsmasq-dns-7cb5889db5-fvhb4\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.059102 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.172903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-frhpj" event={"ID":"88a367f7-4951-4e7c-889f-d147676654f8","Type":"ContainerStarted","Data":"90b39826220e47692b56ceb10eb4f54520e5729930cd0d0bc39ceb38bad7d13d"} Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.174245 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.174279 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.205648 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-frhpj" podStartSLOduration=12.47662192 podStartE2EDuration="18.205631277s" podCreationTimestamp="2026-03-21 05:05:32 +0000 UTC" firstStartedPulling="2026-03-21 05:05:39.526398336 +0000 UTC m=+1092.502861960" lastFinishedPulling="2026-03-21 05:05:45.255407693 +0000 UTC m=+1098.231871317" observedRunningTime="2026-03-21 05:05:50.197794945 +0000 UTC m=+1103.174258579" watchObservedRunningTime="2026-03-21 05:05:50.205631277 +0000 UTC m=+1103.182094901" Mar 21 
05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.672137 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.757364 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth55\" (UniqueName: \"kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55\") pod \"6b9920a8-df59-4747-8f72-1f25d22517b3\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.757456 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config\") pod \"6b9920a8-df59-4747-8f72-1f25d22517b3\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.757504 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc\") pod \"6b9920a8-df59-4747-8f72-1f25d22517b3\" (UID: \"6b9920a8-df59-4747-8f72-1f25d22517b3\") " Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.761920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55" (OuterVolumeSpecName: "kube-api-access-zth55") pod "6b9920a8-df59-4747-8f72-1f25d22517b3" (UID: "6b9920a8-df59-4747-8f72-1f25d22517b3"). InnerVolumeSpecName "kube-api-access-zth55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.792566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b9920a8-df59-4747-8f72-1f25d22517b3" (UID: "6b9920a8-df59-4747-8f72-1f25d22517b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.793431 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config" (OuterVolumeSpecName: "config") pod "6b9920a8-df59-4747-8f72-1f25d22517b3" (UID: "6b9920a8-df59-4747-8f72-1f25d22517b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.864511 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zth55\" (UniqueName: \"kubernetes.io/projected/6b9920a8-df59-4747-8f72-1f25d22517b3-kube-api-access-zth55\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.864547 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.864557 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b9920a8-df59-4747-8f72-1f25d22517b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.900266 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:05:50 crc kubenswrapper[4775]: E0321 05:05:50.900742 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="init" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.900760 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="init" Mar 21 05:05:50 crc kubenswrapper[4775]: E0321 05:05:50.900774 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="dnsmasq-dns" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.900782 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="dnsmasq-dns" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.901011 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" containerName="dnsmasq-dns" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.906910 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.907049 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.908820 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.909018 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.909261 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-t4d72" Mar 21 05:05:50 crc kubenswrapper[4775]: I0321 05:05:50.909664 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-lock\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-cache\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067876 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78kf\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-kube-api-access-z78kf\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93b938-c138-4cfc-a227-e1cd648ad59a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.067980 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169329 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169411 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-lock\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-cache\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " 
pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.169549 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.169577 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.169635 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:05:51.669614739 +0000 UTC m=+1104.646078413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z78kf\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-kube-api-access-z78kf\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8e93b938-c138-4cfc-a227-e1cd648ad59a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.169893 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-lock\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.170032 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8e93b938-c138-4cfc-a227-e1cd648ad59a-cache\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.170136 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.173638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e93b938-c138-4cfc-a227-e1cd648ad59a-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.185274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" event={"ID":"6b9920a8-df59-4747-8f72-1f25d22517b3","Type":"ContainerDied","Data":"70d41f601713d43b2beb92b797cb5b68ffe82649d11afaeaa30ae3ca5021d995"} Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.185329 4775 scope.go:117] 
"RemoveContainer" containerID="c5ba9d79e2eb52707c6820af305f99b1e2b4041e9f4aebd9fe6f0da5d03ca612" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.185333 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4rt6c" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.190727 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78kf\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-kube-api-access-z78kf\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.192942 4775 generic.go:334] "Generic (PLEG): container finished" podID="bade9789-f227-44ab-b7fa-2173445cd381" containerID="48c166a356d78aa2e47a8f4d74ca291a5cf4a906e0b5939c25cc1658ceae0a52" exitCode=0 Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.193000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bade9789-f227-44ab-b7fa-2173445cd381","Type":"ContainerDied","Data":"48c166a356d78aa2e47a8f4d74ca291a5cf4a906e0b5939c25cc1658ceae0a52"} Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.199486 4775 generic.go:334] "Generic (PLEG): container finished" podID="76d205b7-bc2e-4dad-b513-457ff20d67e1" containerID="96c9a92b11a655f0110f2a971188cfd82d21616cb9cd4cd24d771176f3603b12" exitCode=0 Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.199582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"76d205b7-bc2e-4dad-b513-457ff20d67e1","Type":"ContainerDied","Data":"96c9a92b11a655f0110f2a971188cfd82d21616cb9cd4cd24d771176f3603b12"} Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.204030 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.213237 4775 scope.go:117] "RemoveContainer" containerID="e34c6feccb4c3623eb4d46c49fed2bc999053cbdc47d3e661349de5746c1ab7d" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.273893 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"] Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.283743 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4rt6c"] Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.325567 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"] Mar 21 05:05:51 crc kubenswrapper[4775]: W0321 05:05:51.332980 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82af84d9_ff4c_4c51_9b1c_62f1c74b7de5.slice/crio-e02dcc00a6516fa9f13285030a0cc8babf7e17cec678d5d2033e37c7b73146b1 WatchSource:0}: Error finding container e02dcc00a6516fa9f13285030a0cc8babf7e17cec678d5d2033e37c7b73146b1: Status 404 returned error can't find the container with id e02dcc00a6516fa9f13285030a0cc8babf7e17cec678d5d2033e37c7b73146b1 Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.452414 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kzll6"] Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.453540 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.455947 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.461099 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.461380 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.466723 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kzll6"] Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576636 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576790 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576877 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r55zr\" (UniqueName: \"kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.576934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.670910 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9920a8-df59-4747-8f72-1f25d22517b3" path="/var/lib/kubelet/pods/6b9920a8-df59-4747-8f72-1f25d22517b3/volumes" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678597 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678699 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle\") pod \"swift-ring-rebalance-kzll6\" (UID: 
\"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678769 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.678811 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r55zr\" (UniqueName: \"kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.680181 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.680222 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: E0321 05:05:51.680285 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:05:52.68026257 +0000 UTC m=+1105.656726264 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.680647 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.681239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.681239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.683149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.683229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf\") pod 
\"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.683711 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.694370 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r55zr\" (UniqueName: \"kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr\") pod \"swift-ring-rebalance-kzll6\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:51 crc kubenswrapper[4775]: I0321 05:05:51.771263 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.214853 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"76d205b7-bc2e-4dad-b513-457ff20d67e1","Type":"ContainerStarted","Data":"d7da6ca9065a278564c662137d4b73030cfc2bcf91860ac5cc6099fc96832e02"} Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.224467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e","Type":"ContainerStarted","Data":"216f3fe3d5a6c74c85b4da2fb8fcc48f70ef16a89394a57a3d56a99fab9a5f54"} Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.229867 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kzll6"] Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.232507 4775 generic.go:334] "Generic (PLEG): container finished" podID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerID="d8e770cca74b17f5b344c07bee4bb08d1dc49f7e281098f9558394fc5dd7fa11" exitCode=0 Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.232604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" event={"ID":"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5","Type":"ContainerDied","Data":"d8e770cca74b17f5b344c07bee4bb08d1dc49f7e281098f9558394fc5dd7fa11"} Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.232640 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" event={"ID":"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5","Type":"ContainerStarted","Data":"e02dcc00a6516fa9f13285030a0cc8babf7e17cec678d5d2033e37c7b73146b1"} Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.234911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"bade9789-f227-44ab-b7fa-2173445cd381","Type":"ContainerStarted","Data":"532bc505798ddd24619287c620e7ed5ad9ed287bde700c8c2a1b7198767d03e1"} Mar 21 05:05:52 crc kubenswrapper[4775]: W0321 05:05:52.242682 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e521c27_9d67_47bc_b6ac_74fabb543d3f.slice/crio-eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99 WatchSource:0}: Error finding container eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99: Status 404 returned error can't find the container with id eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99 Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.242812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"93886182-fca2-42a9-a134-2243c7c7073d","Type":"ContainerStarted","Data":"ab36bf974063031155464ce2cec10f1d6d423f25084be3ae9828067d552ec991"} Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.278205 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.721696571 podStartE2EDuration="28.278187119s" podCreationTimestamp="2026-03-21 05:05:24 +0000 UTC" firstStartedPulling="2026-03-21 05:05:39.464091214 +0000 UTC m=+1092.440554838" lastFinishedPulling="2026-03-21 05:05:45.020581762 +0000 UTC m=+1097.997045386" observedRunningTime="2026-03-21 05:05:52.244740143 +0000 UTC m=+1105.221203787" watchObservedRunningTime="2026-03-21 05:05:52.278187119 +0000 UTC m=+1105.254650743" Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.281025 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.026247651 podStartE2EDuration="18.281014369s" podCreationTimestamp="2026-03-21 05:05:34 +0000 UTC" firstStartedPulling="2026-03-21 05:05:40.724806697 +0000 UTC m=+1093.701270321" 
lastFinishedPulling="2026-03-21 05:05:50.979573415 +0000 UTC m=+1103.956037039" observedRunningTime="2026-03-21 05:05:52.277542841 +0000 UTC m=+1105.254006465" watchObservedRunningTime="2026-03-21 05:05:52.281014369 +0000 UTC m=+1105.257477993" Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.345835 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.1223692 podStartE2EDuration="20.345819132s" podCreationTimestamp="2026-03-21 05:05:32 +0000 UTC" firstStartedPulling="2026-03-21 05:05:40.728838491 +0000 UTC m=+1093.705302115" lastFinishedPulling="2026-03-21 05:05:50.952288423 +0000 UTC m=+1103.928752047" observedRunningTime="2026-03-21 05:05:52.34541472 +0000 UTC m=+1105.321878334" watchObservedRunningTime="2026-03-21 05:05:52.345819132 +0000 UTC m=+1105.322282756" Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.373394 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.531953534 podStartE2EDuration="27.373375681s" podCreationTimestamp="2026-03-21 05:05:25 +0000 UTC" firstStartedPulling="2026-03-21 05:05:39.091499306 +0000 UTC m=+1092.067962930" lastFinishedPulling="2026-03-21 05:05:44.932921453 +0000 UTC m=+1097.909385077" observedRunningTime="2026-03-21 05:05:52.370843199 +0000 UTC m=+1105.347306843" watchObservedRunningTime="2026-03-21 05:05:52.373375681 +0000 UTC m=+1105.349839315" Mar 21 05:05:52 crc kubenswrapper[4775]: I0321 05:05:52.694431 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:52 crc kubenswrapper[4775]: E0321 05:05:52.694578 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Mar 21 05:05:52 crc kubenswrapper[4775]: E0321 05:05:52.694592 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:05:52 crc kubenswrapper[4775]: E0321 05:05:52.694640 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:05:54.694626856 +0000 UTC m=+1107.671090480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found Mar 21 05:05:53 crc kubenswrapper[4775]: I0321 05:05:53.253963 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" event={"ID":"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5","Type":"ContainerStarted","Data":"370a63757853aedcd7779b33f70117874887acb44e0ca8a8c4868f0ae046d389"} Mar 21 05:05:53 crc kubenswrapper[4775]: I0321 05:05:53.254335 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" Mar 21 05:05:53 crc kubenswrapper[4775]: I0321 05:05:53.257192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzll6" event={"ID":"9e521c27-9d67-47bc-b6ac-74fabb543d3f","Type":"ContainerStarted","Data":"eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99"} Mar 21 05:05:53 crc kubenswrapper[4775]: I0321 05:05:53.285477 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" podStartSLOduration=4.285458245 podStartE2EDuration="4.285458245s" podCreationTimestamp="2026-03-21 05:05:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:05:53.282174792 +0000 UTC m=+1106.258638436" watchObservedRunningTime="2026-03-21 05:05:53.285458245 +0000 UTC m=+1106.261921869" Mar 21 05:05:53 crc kubenswrapper[4775]: I0321 05:05:53.672761 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:54 crc kubenswrapper[4775]: I0321 05:05:54.345451 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:54 crc kubenswrapper[4775]: I0321 05:05:54.390943 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:54 crc kubenswrapper[4775]: I0321 05:05:54.667780 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:54 crc kubenswrapper[4775]: I0321 05:05:54.708781 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:54 crc kubenswrapper[4775]: I0321 05:05:54.725439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:54 crc kubenswrapper[4775]: E0321 05:05:54.725627 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 05:05:54 crc kubenswrapper[4775]: E0321 05:05:54.725649 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 05:05:54 crc kubenswrapper[4775]: E0321 05:05:54.725705 4775 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:05:58.725685357 +0000 UTC m=+1111.702148981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.281339 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.320775 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.323010 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.487124 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.488901 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.489184 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="dnsmasq-dns" containerID="cri-o://370a63757853aedcd7779b33f70117874887acb44e0ca8a8c4868f0ae046d389" gracePeriod=10 Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.491820 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.515276 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d65f699f-g8lds"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.518651 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.521989 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.555546 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-g8lds"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.618276 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ckh8b"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.619685 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.624155 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.638695 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ckh8b"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.686978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj9w2\" (UniqueName: \"kubernetes.io/projected/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-kube-api-access-gj9w2\") pod \"ovn-controller-metrics-ckh8b\" (UID: 
\"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687107 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovn-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-combined-ca-bundle\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687188 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovs-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687230 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config\") pod 
\"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-config\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687308 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.687339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v4nh\" (UniqueName: \"kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.717789 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-g8lds"] Mar 21 05:05:55 crc kubenswrapper[4775]: E0321 05:05:55.720405 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-6v4nh ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-57d65f699f-g8lds" podUID="cfb65104-69e6-4820-baab-3b0eea2a38d7" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.755227 4775 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.756500 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.762838 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.764432 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-config\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789328 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v4nh\" (UniqueName: \"kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789358 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltd4s\" (UniqueName: 
\"kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789516 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj9w2\" (UniqueName: \"kubernetes.io/projected/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-kube-api-access-gj9w2\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovn-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-combined-ca-bundle\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789678 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovs-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.789985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.790890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.792002 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-config\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.792550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovn-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.793105 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: 
\"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.793234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-ovs-rundir\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.794031 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.798922 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-combined-ca-bundle\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.801778 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.803211 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.805868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.806479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-phfns" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.806647 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.806705 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.806923 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.838680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj9w2\" (UniqueName: \"kubernetes.io/projected/9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8-kube-api-access-gj9w2\") pod \"ovn-controller-metrics-ckh8b\" (UID: \"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8\") " pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.838750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.839295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v4nh\" (UniqueName: \"kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh\") pod \"dnsmasq-dns-57d65f699f-g8lds\" (UID: 
\"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891347 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-scripts\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc 
kubenswrapper[4775]: I0321 05:05:55.891392 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-config\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltd4s\" (UniqueName: \"kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891454 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891620 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xn4q\" (UniqueName: \"kubernetes.io/projected/334a5c95-becc-4389-bb6f-50e5957cded6-kube-api-access-5xn4q\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891660 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.891694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.892864 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.892899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.892953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.893638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.908916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltd4s\" (UniqueName: \"kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s\") pod \"dnsmasq-dns-b8fbc5445-b4m98\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.945391 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ckh8b" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.992825 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.992891 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.992926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.992952 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-scripts\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.992980 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-config\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.993015 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.993048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xn4q\" (UniqueName: \"kubernetes.io/projected/334a5c95-becc-4389-bb6f-50e5957cded6-kube-api-access-5xn4q\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.993781 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.994317 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-scripts\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 
05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.994670 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/334a5c95-becc-4389-bb6f-50e5957cded6-config\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:55 crc kubenswrapper[4775]: I0321 05:05:55.996753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.000404 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.001797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334a5c95-becc-4389-bb6f-50e5957cded6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.011846 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xn4q\" (UniqueName: \"kubernetes.io/projected/334a5c95-becc-4389-bb6f-50e5957cded6-kube-api-access-5xn4q\") pod \"ovn-northd-0\" (UID: \"334a5c95-becc-4389-bb6f-50e5957cded6\") " pod="openstack/ovn-northd-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.093442 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.203838 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.290044 4775 generic.go:334] "Generic (PLEG): container finished" podID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerID="370a63757853aedcd7779b33f70117874887acb44e0ca8a8c4868f0ae046d389" exitCode=0 Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.290165 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" event={"ID":"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5","Type":"ContainerDied","Data":"370a63757853aedcd7779b33f70117874887acb44e0ca8a8c4868f0ae046d389"} Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.290254 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.299256 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.399835 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config\") pod \"cfb65104-69e6-4820-baab-3b0eea2a38d7\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.399946 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v4nh\" (UniqueName: \"kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh\") pod \"cfb65104-69e6-4820-baab-3b0eea2a38d7\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.400027 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc\") pod \"cfb65104-69e6-4820-baab-3b0eea2a38d7\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.400051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb\") pod \"cfb65104-69e6-4820-baab-3b0eea2a38d7\" (UID: \"cfb65104-69e6-4820-baab-3b0eea2a38d7\") " Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.400771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config" (OuterVolumeSpecName: "config") pod "cfb65104-69e6-4820-baab-3b0eea2a38d7" (UID: "cfb65104-69e6-4820-baab-3b0eea2a38d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.400784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfb65104-69e6-4820-baab-3b0eea2a38d7" (UID: "cfb65104-69e6-4820-baab-3b0eea2a38d7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.400809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfb65104-69e6-4820-baab-3b0eea2a38d7" (UID: "cfb65104-69e6-4820-baab-3b0eea2a38d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.401417 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.401438 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.401447 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb65104-69e6-4820-baab-3b0eea2a38d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.404738 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh" (OuterVolumeSpecName: "kube-api-access-6v4nh") pod 
"cfb65104-69e6-4820-baab-3b0eea2a38d7" (UID: "cfb65104-69e6-4820-baab-3b0eea2a38d7"). InnerVolumeSpecName "kube-api-access-6v4nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.503423 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v4nh\" (UniqueName: \"kubernetes.io/projected/cfb65104-69e6-4820-baab-3b0eea2a38d7-kube-api-access-6v4nh\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.810879 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:56 crc kubenswrapper[4775]: I0321 05:05:56.810932 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 21 05:05:57 crc kubenswrapper[4775]: I0321 05:05:57.308881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-g8lds" Mar 21 05:05:57 crc kubenswrapper[4775]: I0321 05:05:57.376789 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-g8lds"] Mar 21 05:05:57 crc kubenswrapper[4775]: I0321 05:05:57.383867 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-g8lds"] Mar 21 05:05:57 crc kubenswrapper[4775]: I0321 05:05:57.675080 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb65104-69e6-4820-baab-3b0eea2a38d7" path="/var/lib/kubelet/pods/cfb65104-69e6-4820-baab-3b0eea2a38d7/volumes" Mar 21 05:05:58 crc kubenswrapper[4775]: I0321 05:05:58.739690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:05:58 crc kubenswrapper[4775]: E0321 05:05:58.739990 
4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 21 05:05:58 crc kubenswrapper[4775]: E0321 05:05:58.740007 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 21 05:05:58 crc kubenswrapper[4775]: E0321 05:05:58.740054 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:06:06.740038835 +0000 UTC m=+1119.716502469 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found
Mar 21 05:05:58 crc kubenswrapper[4775]: I0321 05:05:58.949431 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 21 05:05:59 crc kubenswrapper[4775]: I0321 05:05:59.034569 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 21 05:05:59 crc kubenswrapper[4775]: I0321 05:05:59.583502 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.061612 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.158558 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567826-89d6v"]
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.162318 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.165536 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.165816 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.170078 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.177848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-89d6v"]
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.265101 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbtbc\" (UniqueName: \"kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc\") pod \"auto-csr-approver-29567826-89d6v\" (UID: \"0c64d94e-917c-49b6-824b-0b2bdf9691ef\") " pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.367212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbtbc\" (UniqueName: \"kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc\") pod \"auto-csr-approver-29567826-89d6v\" (UID: \"0c64d94e-917c-49b6-824b-0b2bdf9691ef\") " pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.394656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbtbc\" (UniqueName: \"kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc\") pod \"auto-csr-approver-29567826-89d6v\" (UID: \"0c64d94e-917c-49b6-824b-0b2bdf9691ef\") " pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.463979 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"]
Mar 21 05:06:00 crc kubenswrapper[4775]: W0321 05:06:00.468330 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b50afbd_31b8_40ff_bd7b_1ce5021e2837.slice/crio-2e772d1abd813f0f52a871bbd2aa2afc7b674d5369d7a3240dceb79e9ea9149f WatchSource:0}: Error finding container 2e772d1abd813f0f52a871bbd2aa2afc7b674d5369d7a3240dceb79e9ea9149f: Status 404 returned error can't find the container with id 2e772d1abd813f0f52a871bbd2aa2afc7b674d5369d7a3240dceb79e9ea9149f
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.470615 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.481646 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.552434 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4"
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.608745 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ckh8b"]
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.673526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc\") pod \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") "
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.673892 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config\") pod \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") "
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.674021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zcdx\" (UniqueName: \"kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx\") pod \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\" (UID: \"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5\") "
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.680609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx" (OuterVolumeSpecName: "kube-api-access-8zcdx") pod "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" (UID: "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5"). InnerVolumeSpecName "kube-api-access-8zcdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.756818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" (UID: "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.759716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config" (OuterVolumeSpecName: "config") pod "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" (UID: "82af84d9-ff4c-4c51-9b1c-62f1c74b7de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.776109 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zcdx\" (UniqueName: \"kubernetes.io/projected/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-kube-api-access-8zcdx\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.776154 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.776164 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:00 crc kubenswrapper[4775]: E0321 05:06:00.915989 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b50afbd_31b8_40ff_bd7b_1ce5021e2837.slice/crio-5d8a4462c4906d47c3863ff44ffe3802ba03cf186cd035ba4999f12c54eb2a8b.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 05:06:00 crc kubenswrapper[4775]: W0321 05:06:00.993964 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c64d94e_917c_49b6_824b_0b2bdf9691ef.slice/crio-70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563 WatchSource:0}: Error finding container 70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563: Status 404 returned error can't find the container with id 70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563
Mar 21 05:06:00 crc kubenswrapper[4775]: I0321 05:06:00.995263 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-89d6v"]
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.340171 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4" event={"ID":"82af84d9-ff4c-4c51-9b1c-62f1c74b7de5","Type":"ContainerDied","Data":"e02dcc00a6516fa9f13285030a0cc8babf7e17cec678d5d2033e37c7b73146b1"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.340193 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fvhb4"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.340442 4775 scope.go:117] "RemoveContainer" containerID="370a63757853aedcd7779b33f70117874887acb44e0ca8a8c4868f0ae046d389"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.341466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-89d6v" event={"ID":"0c64d94e-917c-49b6-824b-0b2bdf9691ef","Type":"ContainerStarted","Data":"70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.343085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ckh8b" event={"ID":"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8","Type":"ContainerStarted","Data":"1986b1ca4e6f2dfa09df73f55b760d00a5f511d18e0eba6fe1c0b623e4d746d3"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.343141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ckh8b" event={"ID":"9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8","Type":"ContainerStarted","Data":"fef7f5ca2bc3d18af87381e34cb9dcefad0b96fc30f5bf7aace6bf0f4f58d596"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.345341 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzll6" event={"ID":"9e521c27-9d67-47bc-b6ac-74fabb543d3f","Type":"ContainerStarted","Data":"449084f0366fca2d37734d9abfa93601028eb223fcd5ecd49cab997851f8fae3"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.346638 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerID="5d8a4462c4906d47c3863ff44ffe3802ba03cf186cd035ba4999f12c54eb2a8b" exitCode=0
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.346677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" event={"ID":"6b50afbd-31b8-40ff-bd7b-1ce5021e2837","Type":"ContainerDied","Data":"5d8a4462c4906d47c3863ff44ffe3802ba03cf186cd035ba4999f12c54eb2a8b"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.346691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" event={"ID":"6b50afbd-31b8-40ff-bd7b-1ce5021e2837","Type":"ContainerStarted","Data":"2e772d1abd813f0f52a871bbd2aa2afc7b674d5369d7a3240dceb79e9ea9149f"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.349016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"334a5c95-becc-4389-bb6f-50e5957cded6","Type":"ContainerStarted","Data":"7693c5ea17ce78ac391cf46e868c97b34eed5f12803f81339f6dbbb8948bc8c1"}
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.362965 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ckh8b" podStartSLOduration=6.362948562 podStartE2EDuration="6.362948562s" podCreationTimestamp="2026-03-21 05:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:01.362264503 +0000 UTC m=+1114.338728117" watchObservedRunningTime="2026-03-21 05:06:01.362948562 +0000 UTC m=+1114.339412186"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.431184 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kzll6" podStartSLOduration=2.1255120610000002 podStartE2EDuration="10.431162151s" podCreationTimestamp="2026-03-21 05:05:51 +0000 UTC" firstStartedPulling="2026-03-21 05:05:52.258537513 +0000 UTC m=+1105.235001137" lastFinishedPulling="2026-03-21 05:06:00.564187603 +0000 UTC m=+1113.540651227" observedRunningTime="2026-03-21 05:06:01.423990018 +0000 UTC m=+1114.400453642" watchObservedRunningTime="2026-03-21 05:06:01.431162151 +0000 UTC m=+1114.407625775"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.441566 4775 scope.go:117] "RemoveContainer" containerID="d8e770cca74b17f5b344c07bee4bb08d1dc49f7e281098f9558394fc5dd7fa11"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.451589 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"]
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.462654 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fvhb4"]
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.660765 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.674287 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" path="/var/lib/kubelet/pods/82af84d9-ff4c-4c51-9b1c-62f1c74b7de5/volumes"
Mar 21 05:06:01 crc kubenswrapper[4775]: I0321 05:06:01.736354 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.368877 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" event={"ID":"6b50afbd-31b8-40ff-bd7b-1ce5021e2837","Type":"ContainerStarted","Data":"6c721bd1cf19d6ad1fc2eb055e74ccf9afea13852ad17eeb02e87ea598c42465"}
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.369039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98"
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.370392 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"334a5c95-becc-4389-bb6f-50e5957cded6","Type":"ContainerStarted","Data":"9c5152fe0933c9b96daebbc84ec37854c159baf333677ab793d9a9af111e7842"}
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.370409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"334a5c95-becc-4389-bb6f-50e5957cded6","Type":"ContainerStarted","Data":"d667e95ad37e5a372c98b7d3347afd11c23f24668a2941bd2a8ef2bcab231965"}
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.370667 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.373579 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c64d94e-917c-49b6-824b-0b2bdf9691ef" containerID="2acf177da0f2ed7ac0606a427af701da980e32ea57fc8b3636ddd64f2a8a5536" exitCode=0
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.373682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-89d6v" event={"ID":"0c64d94e-917c-49b6-824b-0b2bdf9691ef","Type":"ContainerDied","Data":"2acf177da0f2ed7ac0606a427af701da980e32ea57fc8b3636ddd64f2a8a5536"}
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.389461 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podStartSLOduration=7.389399521 podStartE2EDuration="7.389399521s" podCreationTimestamp="2026-03-21 05:05:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:02.385542431 +0000 UTC m=+1115.362006045" watchObservedRunningTime="2026-03-21 05:06:02.389399521 +0000 UTC m=+1115.365863145"
Mar 21 05:06:02 crc kubenswrapper[4775]: I0321 05:06:02.411850 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=6.392096286 podStartE2EDuration="7.411828115s" podCreationTimestamp="2026-03-21 05:05:55 +0000 UTC" firstStartedPulling="2026-03-21 05:06:00.475676899 +0000 UTC m=+1113.452140523" lastFinishedPulling="2026-03-21 05:06:01.495408728 +0000 UTC m=+1114.471872352" observedRunningTime="2026-03-21 05:06:02.406763712 +0000 UTC m=+1115.383227336" watchObservedRunningTime="2026-03-21 05:06:02.411828115 +0000 UTC m=+1115.388291739"
Mar 21 05:06:03 crc kubenswrapper[4775]: I0321 05:06:03.945725 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.058517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbtbc\" (UniqueName: \"kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc\") pod \"0c64d94e-917c-49b6-824b-0b2bdf9691ef\" (UID: \"0c64d94e-917c-49b6-824b-0b2bdf9691ef\") "
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.064796 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc" (OuterVolumeSpecName: "kube-api-access-wbtbc") pod "0c64d94e-917c-49b6-824b-0b2bdf9691ef" (UID: "0c64d94e-917c-49b6-824b-0b2bdf9691ef"). InnerVolumeSpecName "kube-api-access-wbtbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.160887 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbtbc\" (UniqueName: \"kubernetes.io/projected/0c64d94e-917c-49b6-824b-0b2bdf9691ef-kube-api-access-wbtbc\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.162738 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tq6zw"]
Mar 21 05:06:04 crc kubenswrapper[4775]: E0321 05:06:04.163104 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c64d94e-917c-49b6-824b-0b2bdf9691ef" containerName="oc"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163143 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c64d94e-917c-49b6-824b-0b2bdf9691ef" containerName="oc"
Mar 21 05:06:04 crc kubenswrapper[4775]: E0321 05:06:04.163170 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="init"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163178 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="init"
Mar 21 05:06:04 crc kubenswrapper[4775]: E0321 05:06:04.163204 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="dnsmasq-dns"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163212 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="dnsmasq-dns"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163401 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="82af84d9-ff4c-4c51-9b1c-62f1c74b7de5" containerName="dnsmasq-dns"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163423 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c64d94e-917c-49b6-824b-0b2bdf9691ef" containerName="oc"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.163949 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.165590 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.180858 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tq6zw"]
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.262221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.262285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn42\" (UniqueName: \"kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.364331 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.364412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn42\" (UniqueName: \"kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.364998 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.385009 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn42\" (UniqueName: \"kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42\") pod \"root-account-create-update-tq6zw\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") " pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.484140 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.581136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-89d6v" event={"ID":"0c64d94e-917c-49b6-824b-0b2bdf9691ef","Type":"ContainerDied","Data":"70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563"}
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.581173 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-89d6v"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.581183 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70eac75c6e7292f7ac2d4dacfbd4b1082cd3e02756525e9a961e590c8b4fc563"
Mar 21 05:06:04 crc kubenswrapper[4775]: I0321 05:06:04.949464 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tq6zw"]
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.018678 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-cngbx"]
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.027913 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-cngbx"]
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.592532 4775 generic.go:334] "Generic (PLEG): container finished" podID="af1531c7-e7a2-45cd-a5b9-9678c4808315" containerID="8177bcef8f42975589ec3469527727473adafaa3cc6210ffa908111b0912b0d5" exitCode=0
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.592740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tq6zw" event={"ID":"af1531c7-e7a2-45cd-a5b9-9678c4808315","Type":"ContainerDied","Data":"8177bcef8f42975589ec3469527727473adafaa3cc6210ffa908111b0912b0d5"}
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.592821 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tq6zw" event={"ID":"af1531c7-e7a2-45cd-a5b9-9678c4808315","Type":"ContainerStarted","Data":"490b9754f17abc231dc24b2f80cf9533e36998bd95e31c2f91871cb4975c2c6c"}
Mar 21 05:06:05 crc kubenswrapper[4775]: I0321 05:06:05.677841 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3" path="/var/lib/kubelet/pods/1a4faca2-a8cb-41b5-a0cf-44be1cea8fd3/volumes"
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.096772 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98"
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.163549 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"]
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.163876 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="dnsmasq-dns" containerID="cri-o://9a3778eec58b097b77fc9e80249e34ac71c57ec91481179476b24785158c24e3" gracePeriod=10
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.601462 4775 generic.go:334] "Generic (PLEG): container finished" podID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerID="9a3778eec58b097b77fc9e80249e34ac71c57ec91481179476b24785158c24e3" exitCode=0
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.601558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" event={"ID":"2084b413-8c86-4cc0-89ce-0dbfc4049e9b","Type":"ContainerDied","Data":"9a3778eec58b097b77fc9e80249e34ac71c57ec91481179476b24785158c24e3"}
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.601985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" event={"ID":"2084b413-8c86-4cc0-89ce-0dbfc4049e9b","Type":"ContainerDied","Data":"04459e09511e8f218103b62e1cd093e2c1d10ce88498abbdd40b243c9345ef58"}
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.602003 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04459e09511e8f218103b62e1cd093e2c1d10ce88498abbdd40b243c9345ef58"
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.640020 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb"
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.712375 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f77fw\" (UniqueName: \"kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw\") pod \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") "
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.712477 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config\") pod \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") "
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.713488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc\") pod \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\" (UID: \"2084b413-8c86-4cc0-89ce-0dbfc4049e9b\") "
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.730604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw" (OuterVolumeSpecName: "kube-api-access-f77fw") pod "2084b413-8c86-4cc0-89ce-0dbfc4049e9b" (UID: "2084b413-8c86-4cc0-89ce-0dbfc4049e9b"). InnerVolumeSpecName "kube-api-access-f77fw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.753707 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2084b413-8c86-4cc0-89ce-0dbfc4049e9b" (UID: "2084b413-8c86-4cc0-89ce-0dbfc4049e9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.785127 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config" (OuterVolumeSpecName: "config") pod "2084b413-8c86-4cc0-89ce-0dbfc4049e9b" (UID: "2084b413-8c86-4cc0-89ce-0dbfc4049e9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.816176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0"
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.816370 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.816380 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.816392 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f77fw\" (UniqueName: \"kubernetes.io/projected/2084b413-8c86-4cc0-89ce-0dbfc4049e9b-kube-api-access-f77fw\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:06 crc kubenswrapper[4775]: E0321 05:06:06.816554 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 21 05:06:06 crc kubenswrapper[4775]: E0321 05:06:06.816588 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 21 05:06:06 crc kubenswrapper[4775]: E0321 05:06:06.816639 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift podName:8e93b938-c138-4cfc-a227-e1cd648ad59a nodeName:}" failed. No retries permitted until 2026-03-21 05:06:22.816621463 +0000 UTC m=+1135.793085087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift") pod "swift-storage-0" (UID: "8e93b938-c138-4cfc-a227-e1cd648ad59a") : configmap "swift-ring-files" not found
Mar 21 05:06:06 crc kubenswrapper[4775]: I0321 05:06:06.867081 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tq6zw"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.038679 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rn42\" (UniqueName: \"kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42\") pod \"af1531c7-e7a2-45cd-a5b9-9678c4808315\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") "
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.038759 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts\") pod \"af1531c7-e7a2-45cd-a5b9-9678c4808315\" (UID: \"af1531c7-e7a2-45cd-a5b9-9678c4808315\") "
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.039616 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af1531c7-e7a2-45cd-a5b9-9678c4808315" (UID: "af1531c7-e7a2-45cd-a5b9-9678c4808315"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.042716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42" (OuterVolumeSpecName: "kube-api-access-4rn42") pod "af1531c7-e7a2-45cd-a5b9-9678c4808315" (UID: "af1531c7-e7a2-45cd-a5b9-9678c4808315"). InnerVolumeSpecName "kube-api-access-4rn42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.141039 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rn42\" (UniqueName: \"kubernetes.io/projected/af1531c7-e7a2-45cd-a5b9-9678c4808315-kube-api-access-4rn42\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.142250 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1531c7-e7a2-45cd-a5b9-9678c4808315-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.338413 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fmfms"]
Mar 21 05:06:07 crc kubenswrapper[4775]: E0321 05:06:07.338780 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="dnsmasq-dns"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.338801 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="dnsmasq-dns"
Mar 21 05:06:07 crc kubenswrapper[4775]: E0321 05:06:07.338827 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1531c7-e7a2-45cd-a5b9-9678c4808315" containerName="mariadb-account-create-update"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.338835 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1531c7-e7a2-45cd-a5b9-9678c4808315" containerName="mariadb-account-create-update"
Mar 21 05:06:07 crc kubenswrapper[4775]: E0321 05:06:07.338852 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="init"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.338859 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="init"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.339001 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" containerName="dnsmasq-dns"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.339015 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1531c7-e7a2-45cd-a5b9-9678c4808315" containerName="mariadb-account-create-update"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.339530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fmfms"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.353200 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fmfms"]
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.446680 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms"
Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.446731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kbf\" (UniqueName: \"kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms"
Mar 21 05:06:07
crc kubenswrapper[4775]: I0321 05:06:07.448819 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6de1-account-create-update-bkzxl"] Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.450277 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.454025 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.457848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6de1-account-create-update-bkzxl"] Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.548899 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.548946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.548985 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5pph\" (UniqueName: \"kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 
05:06:07.549020 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kbf\" (UniqueName: \"kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.549858 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.564789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kbf\" (UniqueName: \"kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf\") pod \"glance-db-create-fmfms\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " pod="openstack/glance-db-create-fmfms" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.612244 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tq6zw" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.612244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tq6zw" event={"ID":"af1531c7-e7a2-45cd-a5b9-9678c4808315","Type":"ContainerDied","Data":"490b9754f17abc231dc24b2f80cf9533e36998bd95e31c2f91871cb4975c2c6c"} Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.612255 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-mn5xb" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.612322 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490b9754f17abc231dc24b2f80cf9533e36998bd95e31c2f91871cb4975c2c6c" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.650899 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.650945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5pph\" (UniqueName: \"kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.651970 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.652402 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"] Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.656402 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fmfms" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.658539 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-mn5xb"] Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.673826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5pph\" (UniqueName: \"kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph\") pod \"glance-6de1-account-create-update-bkzxl\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.675288 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2084b413-8c86-4cc0-89ce-0dbfc4049e9b" path="/var/lib/kubelet/pods/2084b413-8c86-4cc0-89ce-0dbfc4049e9b/volumes" Mar 21 05:06:07 crc kubenswrapper[4775]: I0321 05:06:07.763651 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.130748 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fmfms"] Mar 21 05:06:08 crc kubenswrapper[4775]: W0321 05:06:08.132197 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a151b65_5187_48d2_a29a_30eae42c179c.slice/crio-eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23 WatchSource:0}: Error finding container eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23: Status 404 returned error can't find the container with id eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23 Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.211657 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6de1-account-create-update-bkzxl"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.243034 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-j56nc"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.244428 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.256901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j56nc"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.305524 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-594f-account-create-update-9ntmw"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.306445 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.309033 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.314350 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594f-account-create-update-9ntmw"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.367043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lm6t\" (UniqueName: \"kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.367350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.419719 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-429fj"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.421166 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.428806 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-429fj"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.468599 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lm6t\" (UniqueName: \"kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.468728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbh4h\" (UniqueName: \"kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.468766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.468829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.469675 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.489484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lm6t\" (UniqueName: \"kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t\") pod \"keystone-db-create-j56nc\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.551744 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-41dd-account-create-update-5ltn6"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.552713 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.555198 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.563688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-41dd-account-create-update-5ltn6"] Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.569684 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92b4f\" (UniqueName: \"kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.569737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbh4h\" (UniqueName: 
\"kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.569882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.570704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.570800 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.585381 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbh4h\" (UniqueName: \"kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h\") pod \"keystone-594f-account-create-update-9ntmw\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.621748 4775 generic.go:334] "Generic 
(PLEG): container finished" podID="9e521c27-9d67-47bc-b6ac-74fabb543d3f" containerID="449084f0366fca2d37734d9abfa93601028eb223fcd5ecd49cab997851f8fae3" exitCode=0 Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.621810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzll6" event={"ID":"9e521c27-9d67-47bc-b6ac-74fabb543d3f","Type":"ContainerDied","Data":"449084f0366fca2d37734d9abfa93601028eb223fcd5ecd49cab997851f8fae3"} Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.625014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fmfms" event={"ID":"3a151b65-5187-48d2-a29a-30eae42c179c","Type":"ContainerStarted","Data":"f3bbbe70d0549a7acec5479ed8da7aa8abe15ef1981c7636975b0a1dd52d35cc"} Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.625074 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fmfms" event={"ID":"3a151b65-5187-48d2-a29a-30eae42c179c","Type":"ContainerStarted","Data":"eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23"} Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.626360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6de1-account-create-update-bkzxl" event={"ID":"a8c83c73-8982-4c8e-bdeb-204a4874162c","Type":"ContainerStarted","Data":"e4255198ffff3f324a60a06814346bd92cba238dd6b295fce65cd1387f10c2d6"} Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.626386 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6de1-account-create-update-bkzxl" event={"ID":"a8c83c73-8982-4c8e-bdeb-204a4874162c","Type":"ContainerStarted","Data":"51e33874e51c1b20e7b3774c500e497978209cb064fcdfc29426e7e72f1a0361"} Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.628387 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.659084 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-fmfms" podStartSLOduration=1.659063528 podStartE2EDuration="1.659063528s" podCreationTimestamp="2026-03-21 05:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:08.646451091 +0000 UTC m=+1121.622914715" watchObservedRunningTime="2026-03-21 05:06:08.659063528 +0000 UTC m=+1121.635527152" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.663107 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.667587 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-6de1-account-create-update-bkzxl" podStartSLOduration=1.667574519 podStartE2EDuration="1.667574519s" podCreationTimestamp="2026-03-21 05:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:08.658865762 +0000 UTC m=+1121.635329406" watchObservedRunningTime="2026-03-21 05:06:08.667574519 +0000 UTC m=+1121.644038143" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.671714 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wxz\" (UniqueName: \"kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.671779 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.671851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.671943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92b4f\" (UniqueName: \"kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.673195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.692385 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92b4f\" (UniqueName: \"kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f\") pod \"placement-db-create-429fj\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.747185 4775 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-create-429fj" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.773434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wxz\" (UniqueName: \"kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.773498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.775510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.790788 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wxz\" (UniqueName: \"kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz\") pod \"placement-41dd-account-create-update-5ltn6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:08 crc kubenswrapper[4775]: I0321 05:06:08.921676 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.100018 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j56nc"] Mar 21 05:06:09 crc kubenswrapper[4775]: W0321 05:06:09.100302 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e15fc9_dccb_44f6_9266_2bfe23d9e224.slice/crio-c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0 WatchSource:0}: Error finding container c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0: Status 404 returned error can't find the container with id c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0 Mar 21 05:06:09 crc kubenswrapper[4775]: W0321 05:06:09.101731 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88db3052_d0c3_4f00_a116_aea162a7790b.slice/crio-b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b WatchSource:0}: Error finding container b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b: Status 404 returned error can't find the container with id b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.107735 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-594f-account-create-update-9ntmw"] Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.228009 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-429fj"] Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.351785 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-41dd-account-create-update-5ltn6"] Mar 21 05:06:09 crc kubenswrapper[4775]: W0321 05:06:09.358581 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1623bfa5_c196_4b3f_b1ad_2e895cb6e6d6.slice/crio-fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a WatchSource:0}: Error finding container fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a: Status 404 returned error can't find the container with id fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.635209 4775 generic.go:334] "Generic (PLEG): container finished" podID="a8c83c73-8982-4c8e-bdeb-204a4874162c" containerID="e4255198ffff3f324a60a06814346bd92cba238dd6b295fce65cd1387f10c2d6" exitCode=0 Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.635467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6de1-account-create-update-bkzxl" event={"ID":"a8c83c73-8982-4c8e-bdeb-204a4874162c","Type":"ContainerDied","Data":"e4255198ffff3f324a60a06814346bd92cba238dd6b295fce65cd1387f10c2d6"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.636901 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-41dd-account-create-update-5ltn6" event={"ID":"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6","Type":"ContainerStarted","Data":"05fed25f10a4568226dd529e50e8f49b33dd0e98b0199020ea806a0cb77f54ee"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.637006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-41dd-account-create-update-5ltn6" event={"ID":"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6","Type":"ContainerStarted","Data":"fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.638578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j56nc" event={"ID":"17e15fc9-dccb-44f6-9266-2bfe23d9e224","Type":"ContainerStarted","Data":"13179901d0030c432b15b9cdcd3d4bcc8af1fc1231684497164b45d09b84bafe"} Mar 21 05:06:09 crc 
kubenswrapper[4775]: I0321 05:06:09.638626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j56nc" event={"ID":"17e15fc9-dccb-44f6-9266-2bfe23d9e224","Type":"ContainerStarted","Data":"c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.640041 4775 generic.go:334] "Generic (PLEG): container finished" podID="3a151b65-5187-48d2-a29a-30eae42c179c" containerID="f3bbbe70d0549a7acec5479ed8da7aa8abe15ef1981c7636975b0a1dd52d35cc" exitCode=0 Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.640128 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fmfms" event={"ID":"3a151b65-5187-48d2-a29a-30eae42c179c","Type":"ContainerDied","Data":"f3bbbe70d0549a7acec5479ed8da7aa8abe15ef1981c7636975b0a1dd52d35cc"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.641783 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-429fj" event={"ID":"a5ecd8b9-b671-4181-96e9-adf3e820c8c7","Type":"ContainerStarted","Data":"a887c3f98f7514e80429067de51d11f1272a6c7ef6bbe82b5e6e6de5b394be30"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.641806 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-429fj" event={"ID":"a5ecd8b9-b671-4181-96e9-adf3e820c8c7","Type":"ContainerStarted","Data":"54d2934244670d8ac6794aed279432efac3b71d073195ae0ffbaf75a83df79af"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.647011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594f-account-create-update-9ntmw" event={"ID":"88db3052-d0c3-4f00-a116-aea162a7790b","Type":"ContainerStarted","Data":"2b7a69f7824c3af4bbbbe389045d25783f39861d479383b702bdac257cb7b2e3"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.647067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594f-account-create-update-9ntmw" 
event={"ID":"88db3052-d0c3-4f00-a116-aea162a7790b","Type":"ContainerStarted","Data":"b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b"} Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.679074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-429fj" podStartSLOduration=1.679054453 podStartE2EDuration="1.679054453s" podCreationTimestamp="2026-03-21 05:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:09.672164808 +0000 UTC m=+1122.648628432" watchObservedRunningTime="2026-03-21 05:06:09.679054453 +0000 UTC m=+1122.655518077" Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.692629 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-594f-account-create-update-9ntmw" podStartSLOduration=1.692609006 podStartE2EDuration="1.692609006s" podCreationTimestamp="2026-03-21 05:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:09.689713644 +0000 UTC m=+1122.666177268" watchObservedRunningTime="2026-03-21 05:06:09.692609006 +0000 UTC m=+1122.669072630" Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.719002 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-j56nc" podStartSLOduration=1.718981142 podStartE2EDuration="1.718981142s" podCreationTimestamp="2026-03-21 05:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:09.710257615 +0000 UTC m=+1122.686721259" watchObservedRunningTime="2026-03-21 05:06:09.718981142 +0000 UTC m=+1122.695444766" Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.730904 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/placement-41dd-account-create-update-5ltn6" podStartSLOduration=1.730882509 podStartE2EDuration="1.730882509s" podCreationTimestamp="2026-03-21 05:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:09.72387099 +0000 UTC m=+1122.700334634" watchObservedRunningTime="2026-03-21 05:06:09.730882509 +0000 UTC m=+1122.707346133" Mar 21 05:06:09 crc kubenswrapper[4775]: I0321 05:06:09.982415 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.097593 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.097726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.098389 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.097759 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.098579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.098676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.098746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r55zr\" (UniqueName: \"kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.098874 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts\") pod \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\" (UID: \"9e521c27-9d67-47bc-b6ac-74fabb543d3f\") " Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.099112 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.100069 4775 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.100110 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e521c27-9d67-47bc-b6ac-74fabb543d3f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.103758 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr" (OuterVolumeSpecName: "kube-api-access-r55zr") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "kube-api-access-r55zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.106882 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.120543 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts" (OuterVolumeSpecName: "scripts") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.122299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.131497 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e521c27-9d67-47bc-b6ac-74fabb543d3f" (UID: "9e521c27-9d67-47bc-b6ac-74fabb543d3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.201670 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.201721 4775 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.201737 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r55zr\" (UniqueName: \"kubernetes.io/projected/9e521c27-9d67-47bc-b6ac-74fabb543d3f-kube-api-access-r55zr\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.201754 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e521c27-9d67-47bc-b6ac-74fabb543d3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.201767 4775 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e521c27-9d67-47bc-b6ac-74fabb543d3f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.673790 4775 generic.go:334] "Generic (PLEG): container finished" podID="17e15fc9-dccb-44f6-9266-2bfe23d9e224" containerID="13179901d0030c432b15b9cdcd3d4bcc8af1fc1231684497164b45d09b84bafe" exitCode=0 Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.673895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j56nc" event={"ID":"17e15fc9-dccb-44f6-9266-2bfe23d9e224","Type":"ContainerDied","Data":"13179901d0030c432b15b9cdcd3d4bcc8af1fc1231684497164b45d09b84bafe"} Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 
05:06:10.682044 4775 generic.go:334] "Generic (PLEG): container finished" podID="1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" containerID="05fed25f10a4568226dd529e50e8f49b33dd0e98b0199020ea806a0cb77f54ee" exitCode=0 Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.682155 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-41dd-account-create-update-5ltn6" event={"ID":"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6","Type":"ContainerDied","Data":"05fed25f10a4568226dd529e50e8f49b33dd0e98b0199020ea806a0cb77f54ee"} Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.686572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kzll6" event={"ID":"9e521c27-9d67-47bc-b6ac-74fabb543d3f","Type":"ContainerDied","Data":"eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99"} Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.686620 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb20ad310e635a1adb6bbb9a38d57d53e3b88c4783ef5987e437714596390e99" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.686677 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kzll6" Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.692314 4775 generic.go:334] "Generic (PLEG): container finished" podID="a5ecd8b9-b671-4181-96e9-adf3e820c8c7" containerID="a887c3f98f7514e80429067de51d11f1272a6c7ef6bbe82b5e6e6de5b394be30" exitCode=0 Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.692383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-429fj" event={"ID":"a5ecd8b9-b671-4181-96e9-adf3e820c8c7","Type":"ContainerDied","Data":"a887c3f98f7514e80429067de51d11f1272a6c7ef6bbe82b5e6e6de5b394be30"} Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.694781 4775 generic.go:334] "Generic (PLEG): container finished" podID="88db3052-d0c3-4f00-a116-aea162a7790b" containerID="2b7a69f7824c3af4bbbbe389045d25783f39861d479383b702bdac257cb7b2e3" exitCode=0 Mar 21 05:06:10 crc kubenswrapper[4775]: I0321 05:06:10.694965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594f-account-create-update-9ntmw" event={"ID":"88db3052-d0c3-4f00-a116-aea162a7790b","Type":"ContainerDied","Data":"2b7a69f7824c3af4bbbbe389045d25783f39861d479383b702bdac257cb7b2e3"} Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.142089 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tq6zw"] Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.148049 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tq6zw"] Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.190181 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.196387 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-fmfms" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.324342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts\") pod \"3a151b65-5187-48d2-a29a-30eae42c179c\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.324445 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts\") pod \"a8c83c73-8982-4c8e-bdeb-204a4874162c\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.324506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8kbf\" (UniqueName: \"kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf\") pod \"3a151b65-5187-48d2-a29a-30eae42c179c\" (UID: \"3a151b65-5187-48d2-a29a-30eae42c179c\") " Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.324554 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5pph\" (UniqueName: \"kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph\") pod \"a8c83c73-8982-4c8e-bdeb-204a4874162c\" (UID: \"a8c83c73-8982-4c8e-bdeb-204a4874162c\") " Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.325295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8c83c73-8982-4c8e-bdeb-204a4874162c" (UID: "a8c83c73-8982-4c8e-bdeb-204a4874162c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.325587 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c83c73-8982-4c8e-bdeb-204a4874162c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.325763 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a151b65-5187-48d2-a29a-30eae42c179c" (UID: "3a151b65-5187-48d2-a29a-30eae42c179c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.330523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph" (OuterVolumeSpecName: "kube-api-access-b5pph") pod "a8c83c73-8982-4c8e-bdeb-204a4874162c" (UID: "a8c83c73-8982-4c8e-bdeb-204a4874162c"). InnerVolumeSpecName "kube-api-access-b5pph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.330867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf" (OuterVolumeSpecName: "kube-api-access-w8kbf") pod "3a151b65-5187-48d2-a29a-30eae42c179c" (UID: "3a151b65-5187-48d2-a29a-30eae42c179c"). InnerVolumeSpecName "kube-api-access-w8kbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.426635 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a151b65-5187-48d2-a29a-30eae42c179c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.426666 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8kbf\" (UniqueName: \"kubernetes.io/projected/3a151b65-5187-48d2-a29a-30eae42c179c-kube-api-access-w8kbf\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.426676 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5pph\" (UniqueName: \"kubernetes.io/projected/a8c83c73-8982-4c8e-bdeb-204a4874162c-kube-api-access-b5pph\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.677899 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1531c7-e7a2-45cd-a5b9-9678c4808315" path="/var/lib/kubelet/pods/af1531c7-e7a2-45cd-a5b9-9678c4808315/volumes" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.710822 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6de1-account-create-update-bkzxl" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.710842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6de1-account-create-update-bkzxl" event={"ID":"a8c83c73-8982-4c8e-bdeb-204a4874162c","Type":"ContainerDied","Data":"51e33874e51c1b20e7b3774c500e497978209cb064fcdfc29426e7e72f1a0361"} Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.711266 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51e33874e51c1b20e7b3774c500e497978209cb064fcdfc29426e7e72f1a0361" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.713575 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fmfms" Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.713591 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fmfms" event={"ID":"3a151b65-5187-48d2-a29a-30eae42c179c","Type":"ContainerDied","Data":"eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23"} Mar 21 05:06:11 crc kubenswrapper[4775]: I0321 05:06:11.713658 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eed09522f9eaa338fb2dacfffe92955dd1a12162b8fc1fd1c6361c9bf7b0db23" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.002743 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-429fj" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.069904 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92b4f\" (UniqueName: \"kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f\") pod \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.070038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts\") pod \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\" (UID: \"a5ecd8b9-b671-4181-96e9-adf3e820c8c7\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.071357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5ecd8b9-b671-4181-96e9-adf3e820c8c7" (UID: "a5ecd8b9-b671-4181-96e9-adf3e820c8c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.105525 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f" (OuterVolumeSpecName: "kube-api-access-92b4f") pod "a5ecd8b9-b671-4181-96e9-adf3e820c8c7" (UID: "a5ecd8b9-b671-4181-96e9-adf3e820c8c7"). InnerVolumeSpecName "kube-api-access-92b4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.171322 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92b4f\" (UniqueName: \"kubernetes.io/projected/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-kube-api-access-92b4f\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.171353 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5ecd8b9-b671-4181-96e9-adf3e820c8c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.265181 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.268601 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.272221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lm6t\" (UniqueName: \"kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t\") pod \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.272292 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts\") pod \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.272311 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts\") pod 
\"17e15fc9-dccb-44f6-9266-2bfe23d9e224\" (UID: \"17e15fc9-dccb-44f6-9266-2bfe23d9e224\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.272329 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8wxz\" (UniqueName: \"kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz\") pod \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\" (UID: \"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.273517 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" (UID: "1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.273579 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17e15fc9-dccb-44f6-9266-2bfe23d9e224" (UID: "17e15fc9-dccb-44f6-9266-2bfe23d9e224"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.275879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz" (OuterVolumeSpecName: "kube-api-access-h8wxz") pod "1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" (UID: "1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6"). InnerVolumeSpecName "kube-api-access-h8wxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.279592 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t" (OuterVolumeSpecName: "kube-api-access-6lm6t") pod "17e15fc9-dccb-44f6-9266-2bfe23d9e224" (UID: "17e15fc9-dccb-44f6-9266-2bfe23d9e224"). InnerVolumeSpecName "kube-api-access-6lm6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.373454 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lm6t\" (UniqueName: \"kubernetes.io/projected/17e15fc9-dccb-44f6-9266-2bfe23d9e224-kube-api-access-6lm6t\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.373487 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.373496 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e15fc9-dccb-44f6-9266-2bfe23d9e224-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.373504 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8wxz\" (UniqueName: \"kubernetes.io/projected/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6-kube-api-access-h8wxz\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.397272 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.474453 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbh4h\" (UniqueName: \"kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h\") pod \"88db3052-d0c3-4f00-a116-aea162a7790b\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.474560 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts\") pod \"88db3052-d0c3-4f00-a116-aea162a7790b\" (UID: \"88db3052-d0c3-4f00-a116-aea162a7790b\") " Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.475136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88db3052-d0c3-4f00-a116-aea162a7790b" (UID: "88db3052-d0c3-4f00-a116-aea162a7790b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.475787 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88db3052-d0c3-4f00-a116-aea162a7790b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.477570 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h" (OuterVolumeSpecName: "kube-api-access-tbh4h") pod "88db3052-d0c3-4f00-a116-aea162a7790b" (UID: "88db3052-d0c3-4f00-a116-aea162a7790b"). InnerVolumeSpecName "kube-api-access-tbh4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.578433 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbh4h\" (UniqueName: \"kubernetes.io/projected/88db3052-d0c3-4f00-a116-aea162a7790b-kube-api-access-tbh4h\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591416 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-td76j"] Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591763 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591780 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591802 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ecd8b9-b671-4181-96e9-adf3e820c8c7" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591811 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ecd8b9-b671-4181-96e9-adf3e820c8c7" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591822 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e15fc9-dccb-44f6-9266-2bfe23d9e224" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591830 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e15fc9-dccb-44f6-9266-2bfe23d9e224" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591845 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a151b65-5187-48d2-a29a-30eae42c179c" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: 
I0321 05:06:12.591853 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a151b65-5187-48d2-a29a-30eae42c179c" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591867 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c83c73-8982-4c8e-bdeb-204a4874162c" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591874 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c83c73-8982-4c8e-bdeb-204a4874162c" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591890 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e521c27-9d67-47bc-b6ac-74fabb543d3f" containerName="swift-ring-rebalance" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591897 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e521c27-9d67-47bc-b6ac-74fabb543d3f" containerName="swift-ring-rebalance" Mar 21 05:06:12 crc kubenswrapper[4775]: E0321 05:06:12.591923 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88db3052-d0c3-4f00-a116-aea162a7790b" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.591931 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="88db3052-d0c3-4f00-a116-aea162a7790b" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592131 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a151b65-5187-48d2-a29a-30eae42c179c" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592159 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a8c83c73-8982-4c8e-bdeb-204a4874162c" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592169 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="88db3052-d0c3-4f00-a116-aea162a7790b" containerName="mariadb-account-create-update" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592182 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ecd8b9-b671-4181-96e9-adf3e820c8c7" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592198 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e521c27-9d67-47bc-b6ac-74fabb543d3f" containerName="swift-ring-rebalance" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592210 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e15fc9-dccb-44f6-9266-2bfe23d9e224" containerName="mariadb-database-create" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.592775 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.603652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4x9cq" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.603862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.604361 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-td76j"] Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.679986 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.680084 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.680152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.680215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmggl\" (UniqueName: 
\"kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.722273 4775 generic.go:334] "Generic (PLEG): container finished" podID="839e915e-8197-48e9-8b69-56ac420a1eed" containerID="5a7046cba7a8f5bae4cf5ebd270104b81a62d747ce978d3680ad2d2e0a16d243" exitCode=0 Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.722360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerDied","Data":"5a7046cba7a8f5bae4cf5ebd270104b81a62d747ce978d3680ad2d2e0a16d243"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.733222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-41dd-account-create-update-5ltn6" event={"ID":"1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6","Type":"ContainerDied","Data":"fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.733272 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc201b1ec5f6a9dfa6006cd5ad3ef839350e233ae22e75cc42fb3c042862542a" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.733332 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-41dd-account-create-update-5ltn6" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.737227 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j56nc" event={"ID":"17e15fc9-dccb-44f6-9266-2bfe23d9e224","Type":"ContainerDied","Data":"c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.737266 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91792969a9636707c8b5d363c9f76819e9d8b1a4df038d739db479fdd4b1ae0" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.737341 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j56nc" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.738998 4775 generic.go:334] "Generic (PLEG): container finished" podID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerID="1c4292485e2c0ee4f0c83f962fffc48a32f990f8105220ac94e44a4691b7ff1f" exitCode=0 Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.739070 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerDied","Data":"1c4292485e2c0ee4f0c83f962fffc48a32f990f8105220ac94e44a4691b7ff1f"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.744131 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-429fj" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.744205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-429fj" event={"ID":"a5ecd8b9-b671-4181-96e9-adf3e820c8c7","Type":"ContainerDied","Data":"54d2934244670d8ac6794aed279432efac3b71d073195ae0ffbaf75a83df79af"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.744251 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54d2934244670d8ac6794aed279432efac3b71d073195ae0ffbaf75a83df79af" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.747357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-594f-account-create-update-9ntmw" event={"ID":"88db3052-d0c3-4f00-a116-aea162a7790b","Type":"ContainerDied","Data":"b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b"} Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.747505 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b730e73ba2321274902ab7bcdcdddf4bc746dc375617551c9d0547ea06be886b" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.747393 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-594f-account-create-update-9ntmw" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.783085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.783190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.783247 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.783328 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.791974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc 
kubenswrapper[4775]: I0321 05:06:12.798919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.800842 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.801516 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl\") pod \"glance-db-sync-td76j\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") " pod="openstack/glance-db-sync-td76j" Mar 21 05:06:12 crc kubenswrapper[4775]: I0321 05:06:12.924581 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-td76j" Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.501855 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-td76j"] Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.757309 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerStarted","Data":"2b205fc4ed697ad5cbf0298ac1f6315a11af8ff066df4aa11b086ace7ab697d3"} Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.757493 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.759372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-td76j" event={"ID":"716605f1-5111-4e7a-9591-18dfb5da1984","Type":"ContainerStarted","Data":"c9c92c1914b54290237327698e3251a34ba7a8a08da38fd6bd3a1901ce249b25"} Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.761661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerStarted","Data":"c1cc3f5c012ce3f528f11e083192eb0842f7dad8501df783855a594cab5449da"} Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.761878 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.779613 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.423319839 podStartE2EDuration="51.779595948s" podCreationTimestamp="2026-03-21 05:05:22 +0000 UTC" firstStartedPulling="2026-03-21 05:05:27.445217766 +0000 UTC m=+1080.421681390" lastFinishedPulling="2026-03-21 05:05:38.801493875 +0000 UTC m=+1091.777957499" observedRunningTime="2026-03-21 05:06:13.778086215 +0000 UTC 
m=+1126.754549849" watchObservedRunningTime="2026-03-21 05:06:13.779595948 +0000 UTC m=+1126.756059572" Mar 21 05:06:13 crc kubenswrapper[4775]: I0321 05:06:13.802943 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.426032136 podStartE2EDuration="50.802923467s" podCreationTimestamp="2026-03-21 05:05:23 +0000 UTC" firstStartedPulling="2026-03-21 05:05:27.445726631 +0000 UTC m=+1080.422190255" lastFinishedPulling="2026-03-21 05:05:38.822617962 +0000 UTC m=+1091.799081586" observedRunningTime="2026-03-21 05:06:13.799615374 +0000 UTC m=+1126.776078998" watchObservedRunningTime="2026-03-21 05:06:13.802923467 +0000 UTC m=+1126.779387091" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.152545 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-drcfj"] Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.153585 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.155794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.167152 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-drcfj"] Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.235618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts\") pod \"root-account-create-update-drcfj\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.235704 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-csdp7\" (UniqueName: \"kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7\") pod \"root-account-create-update-drcfj\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.269395 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.337304 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts\") pod \"root-account-create-update-drcfj\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.337428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csdp7\" (UniqueName: \"kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7\") pod \"root-account-create-update-drcfj\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.338135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts\") pod \"root-account-create-update-drcfj\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.360369 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csdp7\" (UniqueName: \"kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7\") pod \"root-account-create-update-drcfj\" (UID: 
\"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.475297 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:16 crc kubenswrapper[4775]: I0321 05:06:16.981366 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-drcfj"] Mar 21 05:06:17 crc kubenswrapper[4775]: I0321 05:06:17.796380 4775 generic.go:334] "Generic (PLEG): container finished" podID="b4c6072b-3aa5-43ef-be24-f9b20e5095bd" containerID="db02c0eaade2846804c5f5491c8f4869f16bd0a0624e9878aa43eba4fd63b5ed" exitCode=0 Mar 21 05:06:17 crc kubenswrapper[4775]: I0321 05:06:17.796446 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-drcfj" event={"ID":"b4c6072b-3aa5-43ef-be24-f9b20e5095bd","Type":"ContainerDied","Data":"db02c0eaade2846804c5f5491c8f4869f16bd0a0624e9878aa43eba4fd63b5ed"} Mar 21 05:06:17 crc kubenswrapper[4775]: I0321 05:06:17.796673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-drcfj" event={"ID":"b4c6072b-3aa5-43ef-be24-f9b20e5095bd","Type":"ContainerStarted","Data":"58910afada9e3bd245c22f7b8d6624bba6e27bd44ec98c040a3ae3b0b6c85caf"} Mar 21 05:06:17 crc kubenswrapper[4775]: I0321 05:06:17.841631 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nmtjx" podUID="8a8e948c-2978-40c8-961b-1b010f7ea920" containerName="ovn-controller" probeResult="failure" output=< Mar 21 05:06:17 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 05:06:17 crc kubenswrapper[4775]: > Mar 21 05:06:22 crc kubenswrapper[4775]: I0321 05:06:22.830164 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nmtjx" podUID="8a8e948c-2978-40c8-961b-1b010f7ea920" 
containerName="ovn-controller" probeResult="failure" output=< Mar 21 05:06:22 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 05:06:22 crc kubenswrapper[4775]: > Mar 21 05:06:22 crc kubenswrapper[4775]: I0321 05:06:22.876468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:06:22 crc kubenswrapper[4775]: I0321 05:06:22.896867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e93b938-c138-4cfc-a227-e1cd648ad59a-etc-swift\") pod \"swift-storage-0\" (UID: \"8e93b938-c138-4cfc-a227-e1cd648ad59a\") " pod="openstack/swift-storage-0" Mar 21 05:06:22 crc kubenswrapper[4775]: I0321 05:06:22.904111 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:06:22 crc kubenswrapper[4775]: I0321 05:06:22.905967 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-frhpj" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.042143 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.131495 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nmtjx-config-pjpqh"] Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.134564 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.137980 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.160786 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nmtjx-config-pjpqh"] Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.288555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrlpt\" (UniqueName: \"kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.288906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.288939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.288995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: 
\"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.289057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.289160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390435 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrlpt\" (UniqueName: \"kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run\") pod 
\"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390595 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390623 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.390996 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.391421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: 
\"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.391819 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.393098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.393213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.412732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrlpt\" (UniqueName: \"kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt\") pod \"ovn-controller-nmtjx-config-pjpqh\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:23 crc kubenswrapper[4775]: I0321 05:06:23.472234 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:24 crc kubenswrapper[4775]: I0321 05:06:24.497497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:06:24 crc kubenswrapper[4775]: I0321 05:06:24.564452 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.539828 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9vtvz"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.541644 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.550948 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9vtvz"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.679916 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rbcrx"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.681369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.681501 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd94l\" (UniqueName: \"kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.682495 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.697453 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6ca3-account-create-update-p8m7j"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.698495 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.701849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.706900 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rbcrx"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.732197 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6ca3-account-create-update-p8m7j"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.772546 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-x5t7w"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.773818 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.781563 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-x5t7w"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782725 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782789 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782843 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvvp\" (UniqueName: \"kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd94l\" (UniqueName: \"kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qvw5k\" (UniqueName: \"kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.782999 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.786488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.806870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd94l\" (UniqueName: \"kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l\") pod \"cinder-db-create-9vtvz\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.857471 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7a73-account-create-update-jvhc8"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.860202 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.862946 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.867685 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.868662 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7a73-account-create-update-jvhc8"] Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884167 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884219 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvvp\" (UniqueName: \"kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884276 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvw5k\" (UniqueName: 
\"kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdt5\" (UniqueName: \"kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.884332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.885469 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.885653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.919961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvvp\" (UniqueName: 
\"kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp\") pod \"barbican-db-create-rbcrx\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.921065 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvw5k\" (UniqueName: \"kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k\") pod \"cinder-6ca3-account-create-update-p8m7j\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.986407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglvc\" (UniqueName: \"kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc\") pod \"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.986506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts\") pod \"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.986558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.986617 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kfdt5\" (UniqueName: \"kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:26 crc kubenswrapper[4775]: I0321 05:06:26.987479 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.006235 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.014615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdt5\" (UniqueName: \"kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5\") pod \"neutron-db-create-x5t7w\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.020301 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.041308 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dl5n2"] Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.043042 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.044935 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.045195 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.045696 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nhwlw" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.046137 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.059182 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dl5n2"] Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.088566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglvc\" (UniqueName: \"kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc\") pod \"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.088643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts\") pod \"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.089599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts\") pod 
\"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.089784 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.092241 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b34c-account-create-update-6kvs4"] Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.093628 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.096008 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.101077 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b34c-account-create-update-6kvs4"] Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.122577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglvc\" (UniqueName: \"kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc\") pod \"barbican-7a73-account-create-update-jvhc8\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.190204 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.190882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhb4\" (UniqueName: \"kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.191256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwr5\" (UniqueName: \"kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.191378 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.191505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.191620 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.292958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhb4\" (UniqueName: \"kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.293287 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwr5\" (UniqueName: \"kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.293450 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.293566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.293670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.294561 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.297387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.297561 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.311589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwr5\" (UniqueName: \"kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5\") pod \"neutron-b34c-account-create-update-6kvs4\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.315148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhb4\" (UniqueName: 
\"kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4\") pod \"keystone-db-sync-dl5n2\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.365286 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.414866 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:27 crc kubenswrapper[4775]: I0321 05:06:27.842761 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nmtjx" podUID="8a8e948c-2978-40c8-961b-1b010f7ea920" containerName="ovn-controller" probeResult="failure" output=< Mar 21 05:06:27 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 05:06:27 crc kubenswrapper[4775]: > Mar 21 05:06:31 crc kubenswrapper[4775]: E0321 05:06:31.045790 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 21 05:06:31 crc kubenswrapper[4775]: E0321 05:06:31.046683 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zmggl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-td76j_openstack(716605f1-5111-4e7a-9591-18dfb5da1984): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Mar 21 05:06:31 crc kubenswrapper[4775]: E0321 05:06:31.049218 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-td76j" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.094716 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.256985 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csdp7\" (UniqueName: \"kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7\") pod \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.257089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts\") pod \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\" (UID: \"b4c6072b-3aa5-43ef-be24-f9b20e5095bd\") " Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.258098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4c6072b-3aa5-43ef-be24-f9b20e5095bd" (UID: "b4c6072b-3aa5-43ef-be24-f9b20e5095bd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.258309 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.265866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7" (OuterVolumeSpecName: "kube-api-access-csdp7") pod "b4c6072b-3aa5-43ef-be24-f9b20e5095bd" (UID: "b4c6072b-3aa5-43ef-be24-f9b20e5095bd"). InnerVolumeSpecName "kube-api-access-csdp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.359723 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csdp7\" (UniqueName: \"kubernetes.io/projected/b4c6072b-3aa5-43ef-be24-f9b20e5095bd-kube-api-access-csdp7\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.700710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dl5n2"] Mar 21 05:06:31 crc kubenswrapper[4775]: W0321 05:06:31.703530 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47698618_487f_4849_b179_34398850f0e0.slice/crio-75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394 WatchSource:0}: Error finding container 75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394: Status 404 returned error can't find the container with id 75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394 Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.797522 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.869140 4775 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-x5t7w"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.869208 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6ca3-account-create-update-p8m7j"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.877067 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7a73-account-create-update-jvhc8"] Mar 21 05:06:31 crc kubenswrapper[4775]: W0321 05:06:31.882636 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6526a91_d74c_4b54_9393_979ed50d1df5.slice/crio-f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274 WatchSource:0}: Error finding container f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274: Status 404 returned error can't find the container with id f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274 Mar 21 05:06:31 crc kubenswrapper[4775]: W0321 05:06:31.883906 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32831ef2_2e09_453b_9bea_dffe7423fa37.slice/crio-68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8 WatchSource:0}: Error finding container 68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8: Status 404 returned error can't find the container with id 68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8 Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.884311 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b34c-account-create-update-6kvs4"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.885915 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.887243 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-db-secret" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.893263 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nmtjx-config-pjpqh"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.900316 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rbcrx"] Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.900655 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.945496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-drcfj" event={"ID":"b4c6072b-3aa5-43ef-be24-f9b20e5095bd","Type":"ContainerDied","Data":"58910afada9e3bd245c22f7b8d6624bba6e27bd44ec98c040a3ae3b0b6c85caf"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.945543 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58910afada9e3bd245c22f7b8d6624bba6e27bd44ec98c040a3ae3b0b6c85caf" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.945518 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-drcfj" Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.947561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x5t7w" event={"ID":"74dea94d-fe8b-49e3-9730-fbb641cafaab","Type":"ContainerStarted","Data":"b7911e665933d0bf77ca85fcb51a440cd8a4af416d859020d112bc521d195743"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.949695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nmtjx-config-pjpqh" event={"ID":"b6526a91-d74c-4b54-9393-979ed50d1df5","Type":"ContainerStarted","Data":"f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.951095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"4269ad36bf0a4e521266e7451d179b0ede5228e98c496438299414388c2d9328"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.952468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b34c-account-create-update-6kvs4" event={"ID":"762f4dd9-9a96-4cdb-aa87-e181f5959140","Type":"ContainerStarted","Data":"9c1420f842e90da606dfacce213e4c6be91dcbb1dc3e5c0c7991d7ffc54d4aad"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.953684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6ca3-account-create-update-p8m7j" event={"ID":"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147","Type":"ContainerStarted","Data":"834109a164cebc40b9da57c2e383b99039af503bfbbdbafe954a3b8ce2a534e3"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.955843 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dl5n2" event={"ID":"47698618-487f-4849-b179-34398850f0e0","Type":"ContainerStarted","Data":"75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394"} Mar 21 05:06:31 crc 
kubenswrapper[4775]: I0321 05:06:31.960477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a73-account-create-update-jvhc8" event={"ID":"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b","Type":"ContainerStarted","Data":"43335bb44823d6794353cce5f074df6b49fb1095d0314a1a014838b6ff135074"} Mar 21 05:06:31 crc kubenswrapper[4775]: I0321 05:06:31.961962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rbcrx" event={"ID":"32831ef2-2e09-453b-9bea-dffe7423fa37","Type":"ContainerStarted","Data":"68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8"} Mar 21 05:06:31 crc kubenswrapper[4775]: E0321 05:06:31.963504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-td76j" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.122968 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9vtvz"] Mar 21 05:06:32 crc kubenswrapper[4775]: W0321 05:06:32.157048 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94ac9553_1ce0_4c22_9b2a_e7f38b75a6d6.slice/crio-76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097 WatchSource:0}: Error finding container 76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097: Status 404 returned error can't find the container with id 76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.828826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nmtjx" Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.970845 4775 generic.go:334] "Generic (PLEG): container 
finished" podID="94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" containerID="3d3602be9c806dde89f844d57f5ed730a681b0e2d28bcdfcf3fd72611d393bf0" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.970897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9vtvz" event={"ID":"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6","Type":"ContainerDied","Data":"3d3602be9c806dde89f844d57f5ed730a681b0e2d28bcdfcf3fd72611d393bf0"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.970947 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9vtvz" event={"ID":"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6","Type":"ContainerStarted","Data":"76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.973283 4775 generic.go:334] "Generic (PLEG): container finished" podID="9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" containerID="5d0abdc2d7bcbfcb69006c6e61f11899c9c6ba0f99f80bd6ef0c79dc4299f856" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.973418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a73-account-create-update-jvhc8" event={"ID":"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b","Type":"ContainerDied","Data":"5d0abdc2d7bcbfcb69006c6e61f11899c9c6ba0f99f80bd6ef0c79dc4299f856"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.980032 4775 generic.go:334] "Generic (PLEG): container finished" podID="32831ef2-2e09-453b-9bea-dffe7423fa37" containerID="9349131633ffa81b213e3981f82d949940d821489cd39e0d255301316c6d40ec" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.980190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rbcrx" event={"ID":"32831ef2-2e09-453b-9bea-dffe7423fa37","Type":"ContainerDied","Data":"9349131633ffa81b213e3981f82d949940d821489cd39e0d255301316c6d40ec"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.983500 4775 generic.go:334] "Generic 
(PLEG): container finished" podID="74dea94d-fe8b-49e3-9730-fbb641cafaab" containerID="9076af8667b17ded58ed73ceb5a5784857256d7d90cd1bdaecae68562f5cf370" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.983611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x5t7w" event={"ID":"74dea94d-fe8b-49e3-9730-fbb641cafaab","Type":"ContainerDied","Data":"9076af8667b17ded58ed73ceb5a5784857256d7d90cd1bdaecae68562f5cf370"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.991688 4775 generic.go:334] "Generic (PLEG): container finished" podID="b6526a91-d74c-4b54-9393-979ed50d1df5" containerID="af6715536a4cd1315ee5f22687cb60f190dfcee964ee03a3fb9b9b38cd6a20b3" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.991721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nmtjx-config-pjpqh" event={"ID":"b6526a91-d74c-4b54-9393-979ed50d1df5","Type":"ContainerDied","Data":"af6715536a4cd1315ee5f22687cb60f190dfcee964ee03a3fb9b9b38cd6a20b3"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.993823 4775 generic.go:334] "Generic (PLEG): container finished" podID="762f4dd9-9a96-4cdb-aa87-e181f5959140" containerID="7ca2f730d3508ebef97a36404de20e26217deca533b74b90b3415ce6b5463328" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.993891 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b34c-account-create-update-6kvs4" event={"ID":"762f4dd9-9a96-4cdb-aa87-e181f5959140","Type":"ContainerDied","Data":"7ca2f730d3508ebef97a36404de20e26217deca533b74b90b3415ce6b5463328"} Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.995771 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" containerID="612cd0e0df303a73de827afa0ae711fb6535100e6b3d945953567e8f21be3662" exitCode=0 Mar 21 05:06:32 crc kubenswrapper[4775]: I0321 05:06:32.995806 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-6ca3-account-create-update-p8m7j" event={"ID":"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147","Type":"ContainerDied","Data":"612cd0e0df303a73de827afa0ae711fb6535100e6b3d945953567e8f21be3662"} Mar 21 05:06:34 crc kubenswrapper[4775]: I0321 05:06:34.019068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"1be1e6985bc52afd6c350d60284cefc0e2c924ef8d1e7b54694759d2674029af"} Mar 21 05:06:34 crc kubenswrapper[4775]: I0321 05:06:34.019470 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"25d6d9373325eab7f8bc86e0cf14ba841ba38f0d3415587346c6c57f058a5065"} Mar 21 05:06:34 crc kubenswrapper[4775]: I0321 05:06:34.019488 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"5838046ae2e7d79c32bd035f9302cc2264bbb95880c32b0c68b16eb1c97abe8f"} Mar 21 05:06:34 crc kubenswrapper[4775]: I0321 05:06:34.019499 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"45dfc348c555083ad3ba97211fa33d552ff0a2755e3d42125978389eaf750637"} Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.825226 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.858693 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.883307 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.889163 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgwr5\" (UniqueName: \"kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5\") pod \"762f4dd9-9a96-4cdb-aa87-e181f5959140\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd94l\" (UniqueName: \"kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l\") pod \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890315 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts\") pod \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\" (UID: \"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvvp\" (UniqueName: \"kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp\") pod \"32831ef2-2e09-453b-9bea-dffe7423fa37\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890351 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts\") pod \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts\") pod \"762f4dd9-9a96-4cdb-aa87-e181f5959140\" (UID: \"762f4dd9-9a96-4cdb-aa87-e181f5959140\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890403 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvw5k\" (UniqueName: \"kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k\") pod \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\" (UID: \"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.890437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts\") pod \"32831ef2-2e09-453b-9bea-dffe7423fa37\" (UID: \"32831ef2-2e09-453b-9bea-dffe7423fa37\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.891219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32831ef2-2e09-453b-9bea-dffe7423fa37" (UID: "32831ef2-2e09-453b-9bea-dffe7423fa37"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.891264 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" (UID: "94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.891758 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" (UID: "d3e27f70-6ea1-4f2e-a42e-e65c8ba76147"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.891790 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "762f4dd9-9a96-4cdb-aa87-e181f5959140" (UID: "762f4dd9-9a96-4cdb-aa87-e181f5959140"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.899325 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l" (OuterVolumeSpecName: "kube-api-access-gd94l") pod "94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" (UID: "94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6"). InnerVolumeSpecName "kube-api-access-gd94l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.900274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k" (OuterVolumeSpecName: "kube-api-access-qvw5k") pod "d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" (UID: "d3e27f70-6ea1-4f2e-a42e-e65c8ba76147"). InnerVolumeSpecName "kube-api-access-qvw5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.900338 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5" (OuterVolumeSpecName: "kube-api-access-kgwr5") pod "762f4dd9-9a96-4cdb-aa87-e181f5959140" (UID: "762f4dd9-9a96-4cdb-aa87-e181f5959140"). InnerVolumeSpecName "kube-api-access-kgwr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.904151 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp" (OuterVolumeSpecName: "kube-api-access-snvvp") pod "32831ef2-2e09-453b-9bea-dffe7423fa37" (UID: "32831ef2-2e09-453b-9bea-dffe7423fa37"). InnerVolumeSpecName "kube-api-access-snvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.904742 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.912740 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.932655 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991293 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sglvc\" (UniqueName: \"kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc\") pod \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991353 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrlpt\" (UniqueName: \"kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991393 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991417 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts\") pod \"74dea94d-fe8b-49e3-9730-fbb641cafaab\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991443 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991511 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfdt5\" (UniqueName: \"kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5\") pod \"74dea94d-fe8b-49e3-9730-fbb641cafaab\" (UID: \"74dea94d-fe8b-49e3-9730-fbb641cafaab\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991531 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts\") pod \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\" (UID: \"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991578 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run\") pod \"b6526a91-d74c-4b54-9393-979ed50d1df5\" (UID: \"b6526a91-d74c-4b54-9393-979ed50d1df5\") " Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991845 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgwr5\" (UniqueName: \"kubernetes.io/projected/762f4dd9-9a96-4cdb-aa87-e181f5959140-kube-api-access-kgwr5\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991867 4775 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd94l\" (UniqueName: \"kubernetes.io/projected/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-kube-api-access-gd94l\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991878 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991888 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snvvp\" (UniqueName: \"kubernetes.io/projected/32831ef2-2e09-453b-9bea-dffe7423fa37-kube-api-access-snvvp\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991896 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991904 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/762f4dd9-9a96-4cdb-aa87-e181f5959140-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991913 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvw5k\" (UniqueName: \"kubernetes.io/projected/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147-kube-api-access-qvw5k\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991922 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32831ef2-2e09-453b-9bea-dffe7423fa37-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991947 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run" (OuterVolumeSpecName: "var-run") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.991976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.992054 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74dea94d-fe8b-49e3-9730-fbb641cafaab" (UID: "74dea94d-fe8b-49e3-9730-fbb641cafaab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.992215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.992488 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.992567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" (UID: "9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.992885 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts" (OuterVolumeSpecName: "scripts") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.994725 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc" (OuterVolumeSpecName: "kube-api-access-sglvc") pod "9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" (UID: "9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b"). InnerVolumeSpecName "kube-api-access-sglvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.995628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5" (OuterVolumeSpecName: "kube-api-access-kfdt5") pod "74dea94d-fe8b-49e3-9730-fbb641cafaab" (UID: "74dea94d-fe8b-49e3-9730-fbb641cafaab"). InnerVolumeSpecName "kube-api-access-kfdt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:36 crc kubenswrapper[4775]: I0321 05:06:36.995667 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt" (OuterVolumeSpecName: "kube-api-access-zrlpt") pod "b6526a91-d74c-4b54-9393-979ed50d1df5" (UID: "b6526a91-d74c-4b54-9393-979ed50d1df5"). InnerVolumeSpecName "kube-api-access-zrlpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.042868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a73-account-create-update-jvhc8" event={"ID":"9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b","Type":"ContainerDied","Data":"43335bb44823d6794353cce5f074df6b49fb1095d0314a1a014838b6ff135074"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.042927 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7a73-account-create-update-jvhc8" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.042935 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43335bb44823d6794353cce5f074df6b49fb1095d0314a1a014838b6ff135074" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.044601 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rbcrx" event={"ID":"32831ef2-2e09-453b-9bea-dffe7423fa37","Type":"ContainerDied","Data":"68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.044635 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e92a8c80718e12444d64c7c94f999e3782ac0b507338d6f918f69629f1afa8" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.044651 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rbcrx" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.046032 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-x5t7w" event={"ID":"74dea94d-fe8b-49e3-9730-fbb641cafaab","Type":"ContainerDied","Data":"b7911e665933d0bf77ca85fcb51a440cd8a4af416d859020d112bc521d195743"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.046064 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7911e665933d0bf77ca85fcb51a440cd8a4af416d859020d112bc521d195743" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.046174 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-x5t7w" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.056940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nmtjx-config-pjpqh" event={"ID":"b6526a91-d74c-4b54-9393-979ed50d1df5","Type":"ContainerDied","Data":"f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.056963 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f29fab679d4709d5420d49a0724f8a93a23a7a95f4f911c1ed5c1299a2d4b274" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.057005 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nmtjx-config-pjpqh" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.060325 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b34c-account-create-update-6kvs4" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.060618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b34c-account-create-update-6kvs4" event={"ID":"762f4dd9-9a96-4cdb-aa87-e181f5959140","Type":"ContainerDied","Data":"9c1420f842e90da606dfacce213e4c6be91dcbb1dc3e5c0c7991d7ffc54d4aad"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.060644 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1420f842e90da606dfacce213e4c6be91dcbb1dc3e5c0c7991d7ffc54d4aad" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.062407 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6ca3-account-create-update-p8m7j" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.062414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6ca3-account-create-update-p8m7j" event={"ID":"d3e27f70-6ea1-4f2e-a42e-e65c8ba76147","Type":"ContainerDied","Data":"834109a164cebc40b9da57c2e383b99039af503bfbbdbafe954a3b8ce2a534e3"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.062460 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834109a164cebc40b9da57c2e383b99039af503bfbbdbafe954a3b8ce2a534e3" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.064667 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9vtvz" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.065541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9vtvz" event={"ID":"94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6","Type":"ContainerDied","Data":"76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.065586 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ed179c77d4582dff3583fe5aa8861bdab403dbd995245cbd2608d01ae0d097" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.068396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dl5n2" event={"ID":"47698618-487f-4849-b179-34398850f0e0","Type":"ContainerStarted","Data":"36bd24e1b5566dbbfb969007cb73ea97d71cecb278d83fa7485e1f5061b3eb1d"} Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.090823 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dl5n2" podStartSLOduration=6.168869552 podStartE2EDuration="11.090804816s" podCreationTimestamp="2026-03-21 05:06:26 +0000 UTC" firstStartedPulling="2026-03-21 05:06:31.705668213 +0000 UTC m=+1144.682131837" lastFinishedPulling="2026-03-21 05:06:36.627603477 +0000 UTC m=+1149.604067101" observedRunningTime="2026-03-21 05:06:37.087886354 +0000 UTC m=+1150.064349998" watchObservedRunningTime="2026-03-21 05:06:37.090804816 +0000 UTC m=+1150.067268440" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097559 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrlpt\" (UniqueName: \"kubernetes.io/projected/b6526a91-d74c-4b54-9393-979ed50d1df5-kube-api-access-zrlpt\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097588 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097600 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74dea94d-fe8b-49e3-9730-fbb641cafaab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097609 4775 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097622 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfdt5\" (UniqueName: \"kubernetes.io/projected/74dea94d-fe8b-49e3-9730-fbb641cafaab-kube-api-access-kfdt5\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097632 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097640 4775 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097679 4775 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6526a91-d74c-4b54-9393-979ed50d1df5-var-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097787 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sglvc\" (UniqueName: \"kubernetes.io/projected/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b-kube-api-access-sglvc\") on node \"crc\" DevicePath \"\"" Mar 21 
05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.097802 4775 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6526a91-d74c-4b54-9393-979ed50d1df5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:37 crc kubenswrapper[4775]: I0321 05:06:37.528657 4775 scope.go:117] "RemoveContainer" containerID="7d8eac46771bcd3a496da8e38bf914f7acad2a5e17c1713aea69b3a4abe8ee15" Mar 21 05:06:38 crc kubenswrapper[4775]: I0321 05:06:38.034653 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nmtjx-config-pjpqh"] Mar 21 05:06:38 crc kubenswrapper[4775]: I0321 05:06:38.046093 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nmtjx-config-pjpqh"] Mar 21 05:06:38 crc kubenswrapper[4775]: I0321 05:06:38.078445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"990c4464a56e9fdfe73e196f6716e0777acbe6f63422d3ab79b19fdbc492224b"} Mar 21 05:06:38 crc kubenswrapper[4775]: I0321 05:06:38.078495 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"9193bbebc1d7ac9486ecf1ebbb5784ab20e11ffe4c1d64be6b7773fd6661b433"} Mar 21 05:06:39 crc kubenswrapper[4775]: I0321 05:06:39.093421 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"243042feb3954e380e0eb5b0bc34136a7c3c9b637b3f94139529cf2dc89efe25"} Mar 21 05:06:39 crc kubenswrapper[4775]: I0321 05:06:39.094088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"a37a7dc2c9267a0e9e64d59e2a1136010658f04372c37deef124c10dfa468170"} Mar 
21 05:06:39 crc kubenswrapper[4775]: I0321 05:06:39.672215 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6526a91-d74c-4b54-9393-979ed50d1df5" path="/var/lib/kubelet/pods/b6526a91-d74c-4b54-9393-979ed50d1df5/volumes" Mar 21 05:06:42 crc kubenswrapper[4775]: I0321 05:06:42.125188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"3c5e645400033189bda6cf0d6f22c66470d7937542ebf820b4e888c1efdfd822"} Mar 21 05:06:42 crc kubenswrapper[4775]: I0321 05:06:42.125818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"41f21e794ac223855a36a52e58e57a9292cbe65a75e68474ac07df421017d212"} Mar 21 05:06:42 crc kubenswrapper[4775]: I0321 05:06:42.125837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"4b95d23f9850741608d8b143586b154ee4a4b9b98bdb9b7751b8aae5e94cf0f7"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.138002 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"9ccb53a7eb02a588222376fcfb6617e147754262db12f8a2ad409ec6e9924d5f"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.138339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"576404f60529e8c45704bba2171c3da8eb1caaaac0f8a7b830c16c1f6056451e"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.138353 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"c4bbf35a341f82d4c4680a9e20ce60c9d5acbc7d6b9e3c3b86a8d877b321b119"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.138363 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8e93b938-c138-4cfc-a227-e1cd648ad59a","Type":"ContainerStarted","Data":"dec5ec835fe61a6d902cb8de216092907ca1f69e3b0758ed5ea70a23c42d21bc"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.141223 4775 generic.go:334] "Generic (PLEG): container finished" podID="47698618-487f-4849-b179-34398850f0e0" containerID="36bd24e1b5566dbbfb969007cb73ea97d71cecb278d83fa7485e1f5061b3eb1d" exitCode=0 Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.141257 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dl5n2" event={"ID":"47698618-487f-4849-b179-34398850f0e0","Type":"ContainerDied","Data":"36bd24e1b5566dbbfb969007cb73ea97d71cecb278d83fa7485e1f5061b3eb1d"} Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.176077 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.462804215 podStartE2EDuration="54.176051559s" podCreationTimestamp="2026-03-21 05:05:49 +0000 UTC" firstStartedPulling="2026-03-21 05:06:31.79042419 +0000 UTC m=+1144.766887824" lastFinishedPulling="2026-03-21 05:06:41.503671544 +0000 UTC m=+1154.480135168" observedRunningTime="2026-03-21 05:06:43.172756675 +0000 UTC m=+1156.149220319" watchObservedRunningTime="2026-03-21 05:06:43.176051559 +0000 UTC m=+1156.152515183" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450174 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450773 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6526a91-d74c-4b54-9393-979ed50d1df5" containerName="ovn-config" Mar 21 
05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450787 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6526a91-d74c-4b54-9393-979ed50d1df5" containerName="ovn-config" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450801 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450814 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450820 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450836 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32831ef2-2e09-453b-9bea-dffe7423fa37" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450843 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32831ef2-2e09-453b-9bea-dffe7423fa37" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762f4dd9-9a96-4cdb-aa87-e181f5959140" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450860 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="762f4dd9-9a96-4cdb-aa87-e181f5959140" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450869 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450875 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450885 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74dea94d-fe8b-49e3-9730-fbb641cafaab" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450890 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="74dea94d-fe8b-49e3-9730-fbb641cafaab" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: E0321 05:06:43.450900 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c6072b-3aa5-43ef-be24-f9b20e5095bd" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.450907 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c6072b-3aa5-43ef-be24-f9b20e5095bd" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451040 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="32831ef2-2e09-453b-9bea-dffe7423fa37" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451049 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6526a91-d74c-4b54-9393-979ed50d1df5" containerName="ovn-config" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451060 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451070 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" containerName="mariadb-database-create" Mar 21 05:06:43 crc 
kubenswrapper[4775]: I0321 05:06:43.451078 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="74dea94d-fe8b-49e3-9730-fbb641cafaab" containerName="mariadb-database-create" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451086 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451096 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c6072b-3aa5-43ef-be24-f9b20e5095bd" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451107 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="762f4dd9-9a96-4cdb-aa87-e181f5959140" containerName="mariadb-account-create-update" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.451930 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.453887 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.467871 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523296 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s69ql\" (UniqueName: \"kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523380 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523461 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523491 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.523523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625773 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.625864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69ql\" (UniqueName: 
\"kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.628098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.628848 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.629172 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.630136 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.630562 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config\") pod 
\"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.644576 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69ql\" (UniqueName: \"kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql\") pod \"dnsmasq-dns-5c79d794d7-jhdxv\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:43 crc kubenswrapper[4775]: I0321 05:06:43.773605 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:44 crc kubenswrapper[4775]: I0321 05:06:44.211443 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:44 crc kubenswrapper[4775]: W0321 05:06:44.224533 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c4bdcc_f0d3_452d_87ac_52ce486f1c52.slice/crio-ddb13a088c51c9a2b5faad8f81230304cbd7e86c02e650cace18f83a00dc10db WatchSource:0}: Error finding container ddb13a088c51c9a2b5faad8f81230304cbd7e86c02e650cace18f83a00dc10db: Status 404 returned error can't find the container with id ddb13a088c51c9a2b5faad8f81230304cbd7e86c02e650cace18f83a00dc10db Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.431581 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.542006 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle\") pod \"47698618-487f-4849-b179-34398850f0e0\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.542144 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data\") pod \"47698618-487f-4849-b179-34398850f0e0\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.542170 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwhb4\" (UniqueName: \"kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4\") pod \"47698618-487f-4849-b179-34398850f0e0\" (UID: \"47698618-487f-4849-b179-34398850f0e0\") " Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.545931 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4" (OuterVolumeSpecName: "kube-api-access-lwhb4") pod "47698618-487f-4849-b179-34398850f0e0" (UID: "47698618-487f-4849-b179-34398850f0e0"). InnerVolumeSpecName "kube-api-access-lwhb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.563312 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47698618-487f-4849-b179-34398850f0e0" (UID: "47698618-487f-4849-b179-34398850f0e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.586177 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data" (OuterVolumeSpecName: "config-data") pod "47698618-487f-4849-b179-34398850f0e0" (UID: "47698618-487f-4849-b179-34398850f0e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.644443 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.644479 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47698618-487f-4849-b179-34398850f0e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:44.644493 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwhb4\" (UniqueName: \"kubernetes.io/projected/47698618-487f-4849-b179-34398850f0e0-kube-api-access-lwhb4\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.158292 4775 generic.go:334] "Generic (PLEG): container finished" podID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerID="11cb266c6b3aa03b4030e08a2478a4c8badfb6b932b90e369c8e339bd2194478" exitCode=0 Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.158394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" event={"ID":"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52","Type":"ContainerDied","Data":"11cb266c6b3aa03b4030e08a2478a4c8badfb6b932b90e369c8e339bd2194478"} Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.158418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" event={"ID":"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52","Type":"ContainerStarted","Data":"ddb13a088c51c9a2b5faad8f81230304cbd7e86c02e650cace18f83a00dc10db"} Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.160272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dl5n2" event={"ID":"47698618-487f-4849-b179-34398850f0e0","Type":"ContainerDied","Data":"75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394"} Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.160302 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dl5n2" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.160304 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75aba715be05adf1dd95f2ce6be9dc16fa293f4f6be54a7933d8a4a59ff5f394" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.447763 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.514245 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"] Mar 21 05:06:45 crc kubenswrapper[4775]: E0321 05:06:45.514827 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47698618-487f-4849-b179-34398850f0e0" containerName="keystone-db-sync" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.514908 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="47698618-487f-4849-b179-34398850f0e0" containerName="keystone-db-sync" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.515234 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="47698618-487f-4849-b179-34398850f0e0" containerName="keystone-db-sync" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.516435 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.531286 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kbtmv"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.532707 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.541223 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nhwlw" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.541441 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.541326 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.541394 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.541601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.561751 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.579988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kbtmv"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 
05:06:45.668378 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668404 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m66r\" (UniqueName: \"kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668445 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668498 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 
05:06:45.668555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znghz\" (UniqueName: \"kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668628 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668655 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.668677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc 
kubenswrapper[4775]: I0321 05:06:45.668706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.708427 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9fd475b9c-m48br"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.716860 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9fd475b9c-m48br"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.716982 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.725170 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wvjlb"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.726434 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.735633 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.735794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pms25" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.738479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.749678 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-l8qqc"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.752469 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.752710 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rcxc7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.752831 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.758334 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.760055 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772424 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772455 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc 
kubenswrapper[4775]: I0321 05:06:45.772554 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m66r\" (UniqueName: \"kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772694 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772728 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-znghz\" (UniqueName: \"kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.772765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.773872 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.774263 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.775334 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.775892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.810774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.811382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.813246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.851492 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nm9jl" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.851719 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.851905 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.885334 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.885604 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf22x\" (UniqueName: \"kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887324 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjzgj\" (UniqueName: \"kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887377 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.887924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764f7\" (UniqueName: \"kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.888002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.888044 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.888089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.891777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znghz\" (UniqueName: \"kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.905753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys\") pod \"keystone-bootstrap-kbtmv\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.929179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-l8qqc"] Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.929720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m66r\" (UniqueName: \"kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r\") pod \"dnsmasq-dns-5b868669f-t7qd7\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:45 crc kubenswrapper[4775]: I0321 05:06:45.953130 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wvjlb"] Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.006932 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.006975 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.007050 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.007089 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.035019 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041233 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf22x\" (UniqueName: \"kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config\") pod \"neutron-db-sync-wvjlb\" (UID: 
\"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjzgj\" (UniqueName: \"kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041501 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041568 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.041866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-764f7\" (UniqueName: \"kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.044192 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.046331 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.046911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.048777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.049840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.061630 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.062660 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"ceilometer-config-data" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.067832 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.072338 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.074413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.078566 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.080345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.106352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " 
pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.106599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf22x\" (UniqueName: \"kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x\") pod \"neutron-db-sync-wvjlb\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.108296 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764f7\" (UniqueName: \"kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7\") pod \"cinder-db-sync-l8qqc\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.121868 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.130785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjzgj\" (UniqueName: \"kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj\") pod \"horizon-9fd475b9c-m48br\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.135233 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"] Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.137022 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc489768c-rlbc8" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.150902 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.176752 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177321 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8h8k\" (UniqueName: \"kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177682 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.177779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.178825 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.229833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" event={"ID":"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52","Type":"ContainerStarted","Data":"44277cb33149a33c5c08b0bf964d1c1e6f322cfb36b55ecaba7cde47956ea895"} Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.230825 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.236596 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="dnsmasq-dns" containerID="cri-o://44277cb33149a33c5c08b0bf964d1c1e6f322cfb36b55ecaba7cde47956ea895" gracePeriod=10 Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.247690 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"] Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283169 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283313 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstsp\" (UniqueName: \"kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283420 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8h8k\" (UniqueName: \"kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.283552 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.284690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.284837 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.285494 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.286042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wvjlb"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.290026 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.291975 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.292035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.293318 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.298972 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dh4pj"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.300298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.304455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8q2g8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.304599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.304729 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.307053 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8h8k\" (UniqueName: \"kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k\") pod \"ceilometer-0\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.309927 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9cmq2"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.311703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.316504 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.316710 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ph6zz"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.341331 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9cmq2"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.351531 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.353094 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.364538 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dh4pj"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.364957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9fd475b9c-m48br"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386263 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bczb\" (UniqueName: \"kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tstsp\" (UniqueName: \"kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386484 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386575 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386628 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgm22\" (UniqueName: \"kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386724 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.386805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.388490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.388678 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.390792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.396457 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.399381 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.403554 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-l8qqc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.415866 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" podStartSLOduration=3.415841081 podStartE2EDuration="3.415841081s" podCreationTimestamp="2026-03-21 05:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:46.251807552 +0000 UTC m=+1159.228271186" watchObservedRunningTime="2026-03-21 05:06:46.415841081 +0000 UTC m=+1159.392304705"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.419839 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstsp\" (UniqueName: \"kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp\") pod \"horizon-7bc489768c-rlbc8\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.450077 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.466503 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc489768c-rlbc8"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.492763 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.492887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.492936 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgm22\" (UniqueName: \"kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.492960 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.492987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493025 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493101 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493266 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493520 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzx7f\" (UniqueName: \"kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.493800 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bczb\" (UniqueName: \"kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.498201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.500790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.502764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.505339 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.506083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.507017 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.521083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bczb\" (UniqueName: \"kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb\") pod \"barbican-db-sync-9cmq2\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.524880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgm22\" (UniqueName: \"kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22\") pod \"placement-db-sync-dh4pj\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.580779 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"]
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595066 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595207 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzx7f\" (UniqueName: \"kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595294 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.595368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.596445 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.597064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.597578 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.598149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.598929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.623021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzx7f\" (UniqueName: \"kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f\") pod \"dnsmasq-dns-cf78879c9-4nfhc\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.632580 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh4pj"
Mar 21 05:06:46 crc kubenswrapper[4775]: W0321 05:06:46.634495 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded132640_db0b_458b_ba3f_d5aedb284447.slice/crio-80631c4a829e6fbdb4bacbdb17279550e80269d3798a6721d9431adbccaec3b9 WatchSource:0}: Error finding container 80631c4a829e6fbdb4bacbdb17279550e80269d3798a6721d9431adbccaec3b9: Status 404 returned error can't find the container with id 80631c4a829e6fbdb4bacbdb17279550e80269d3798a6721d9431adbccaec3b9
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.653723 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:06:46 crc kubenswrapper[4775]: I0321 05:06:46.699195 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc"
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.006370 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9fd475b9c-m48br"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.014148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wvjlb"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.026579 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kbtmv"]
Mar 21 05:06:47 crc kubenswrapper[4775]: W0321 05:06:47.029975 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37bb7e34_ac47_44f6_b18f_ef4ed78eea6a.slice/crio-45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1 WatchSource:0}: Error finding container 45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1: Status 404 returned error can't find the container with id 45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1
Mar 21 05:06:47 crc kubenswrapper[4775]: W0321 05:06:47.031794 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf281a5f0_270c_4d95_93b1_ad265786b886.slice/crio-3ed2947533338d4f3898375bd2e8d20bf0cdf56e90ab25944e2ce0199f079b4a WatchSource:0}: Error finding container 3ed2947533338d4f3898375bd2e8d20bf0cdf56e90ab25944e2ce0199f079b4a: Status 404 returned error can't find the container with id 3ed2947533338d4f3898375bd2e8d20bf0cdf56e90ab25944e2ce0199f079b4a
Mar 21 05:06:47 crc kubenswrapper[4775]: W0321 05:06:47.044801 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9531f5f9_8f77_4882_b779_4210b1de81de.slice/crio-a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c WatchSource:0}: Error finding container a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c: Status 404 returned error can't find the container with id a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.050791 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-l8qqc"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.187332 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.244351 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.247728 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9fd475b9c-m48br" event={"ID":"f281a5f0-270c-4d95-93b1-ad265786b886","Type":"ContainerStarted","Data":"3ed2947533338d4f3898375bd2e8d20bf0cdf56e90ab25944e2ce0199f079b4a"}
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.249592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l8qqc" event={"ID":"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a","Type":"ContainerStarted","Data":"45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1"}
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.251178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wvjlb" event={"ID":"9531f5f9-8f77-4882-b779-4210b1de81de","Type":"ContainerStarted","Data":"a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c"}
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.252587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" event={"ID":"ed132640-db0b-458b-ba3f-d5aedb284447","Type":"ContainerStarted","Data":"80631c4a829e6fbdb4bacbdb17279550e80269d3798a6721d9431adbccaec3b9"}
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.259686 4775 generic.go:334] "Generic (PLEG): container finished" podID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerID="44277cb33149a33c5c08b0bf964d1c1e6f322cfb36b55ecaba7cde47956ea895" exitCode=0
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.259825 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" event={"ID":"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52","Type":"ContainerDied","Data":"44277cb33149a33c5c08b0bf964d1c1e6f322cfb36b55ecaba7cde47956ea895"}
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.279857 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbtmv" event={"ID":"50f0ec15-decb-4879-9410-0962a21f83ae","Type":"ContainerStarted","Data":"241678b559ba9174070a5793922c2b069fb839b133fbc00760a37f9bf9777ed9"}
Mar 21 05:06:47 crc kubenswrapper[4775]: W0321 05:06:47.435706 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cd9d81f_8c8a_46f9_9943_42dc0b638bef.slice/crio-f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c WatchSource:0}: Error finding container f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c: Status 404 returned error can't find the container with id f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.448893 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dh4pj"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.486243 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9cmq2"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.514962 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.527585 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.529376 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c4c5b5499-jnmhl"
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.534088 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"]
Mar 21 05:06:47 crc kubenswrapper[4775]: W0321 05:06:47.547084 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaad9349_8b1e_4f07_b3c9_36bc4781a386.slice/crio-545d0a4dfbecffb0f9c29e4cf099e382302c9c5682c08fe8a7fbe8e6c0247a35 WatchSource:0}: Error finding container 545d0a4dfbecffb0f9c29e4cf099e382302c9c5682c08fe8a7fbe8e6c0247a35: Status 404 returned error can't find the container with id 545d0a4dfbecffb0f9c29e4cf099e382302c9c5682c08fe8a7fbe8e6c0247a35
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.567713 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.572975 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv"
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.588891 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.732483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") "
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.732669 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") "
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.732707 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") "
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.732782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") "
Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.732861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s69ql\" (UniqueName: \"kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") "
Mar 21 05:06:47
crc kubenswrapper[4775]: I0321 05:06:47.733030 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb\") pod \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\" (UID: \"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52\") " Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.733460 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbdnp\" (UniqueName: \"kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.733568 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.733632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.733981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.734026 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.758163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql" (OuterVolumeSpecName: "kube-api-access-s69ql") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "kube-api-access-s69ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.835416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.841548 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.841696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.842468 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbdnp\" (UniqueName: \"kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.842642 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.842700 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.842975 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s69ql\" (UniqueName: \"kubernetes.io/projected/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-kube-api-access-s69ql\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.843798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.843801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: 
\"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.849345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.880754 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbdnp\" (UniqueName: \"kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp\") pod \"horizon-6c4c5b5499-jnmhl\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:47 crc kubenswrapper[4775]: I0321 05:06:47.898025 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.042515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.048790 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config" (OuterVolumeSpecName: "config") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.049916 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.049931 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.052670 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.052837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.062605 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" (UID: "e2c4bdcc-f0d3-452d-87ac-52ce486f1c52"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.151178 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.151219 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.151230 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.308628 4775 generic.go:334] "Generic (PLEG): container finished" podID="ed132640-db0b-458b-ba3f-d5aedb284447" containerID="0b3ac32d8cf0aa92220dc7dc36f6c6433239ed9b4e495f670922d65ddd3a5492" exitCode=0 Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.308917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" event={"ID":"ed132640-db0b-458b-ba3f-d5aedb284447","Type":"ContainerDied","Data":"0b3ac32d8cf0aa92220dc7dc36f6c6433239ed9b4e495f670922d65ddd3a5492"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.314304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh4pj" event={"ID":"5cd9d81f-8c8a-46f9-9943-42dc0b638bef","Type":"ContainerStarted","Data":"f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.317736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" 
event={"ID":"e2c4bdcc-f0d3-452d-87ac-52ce486f1c52","Type":"ContainerDied","Data":"ddb13a088c51c9a2b5faad8f81230304cbd7e86c02e650cace18f83a00dc10db"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.317794 4775 scope.go:117] "RemoveContainer" containerID="44277cb33149a33c5c08b0bf964d1c1e6f322cfb36b55ecaba7cde47956ea895" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.317914 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jhdxv" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.333630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wvjlb" event={"ID":"9531f5f9-8f77-4882-b779-4210b1de81de","Type":"ContainerStarted","Data":"f19a3eb592d811463bd26fefd306c16b1ec6187913716e5585ce5b02489e2c0c"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.341013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cmq2" event={"ID":"1658991f-7a2b-4ce9-a240-a940385e0b8f","Type":"ContainerStarted","Data":"cb42616105e59d6830a168b6da9f0c076bcf19cd878442d47ceb6cd33726aca0"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.342754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbtmv" event={"ID":"50f0ec15-decb-4879-9410-0962a21f83ae","Type":"ContainerStarted","Data":"fdcd202238496f57430b4ad2beaac05c4b222263df155d2d3bab8e200bbd88c5"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.345339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bc489768c-rlbc8" event={"ID":"44b58fbc-be53-4531-9dee-05ba398f25ef","Type":"ContainerStarted","Data":"e60f7afbe17b7aa1bbfaf8e491a43aeb53519de12ebcef7dcaf29708ec522108"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.354021 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerStarted","Data":"25ef4b17ffe6969bdd8510a4d651e67244e698c334397d84b7e113488d96c6fb"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.365694 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wvjlb" podStartSLOduration=3.365671423 podStartE2EDuration="3.365671423s" podCreationTimestamp="2026-03-21 05:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:48.349219568 +0000 UTC m=+1161.325683202" watchObservedRunningTime="2026-03-21 05:06:48.365671423 +0000 UTC m=+1161.342135047" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.374611 4775 scope.go:117] "RemoveContainer" containerID="11cb266c6b3aa03b4030e08a2478a4c8badfb6b932b90e369c8e339bd2194478" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.374673 4775 generic.go:334] "Generic (PLEG): container finished" podID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerID="4bd42a0432eaefd26033f0ffdb594981243afbef4a05205089435812a0be336b" exitCode=0 Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.374722 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" event={"ID":"aaad9349-8b1e-4f07-b3c9-36bc4781a386","Type":"ContainerDied","Data":"4bd42a0432eaefd26033f0ffdb594981243afbef4a05205089435812a0be336b"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.374754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" event={"ID":"aaad9349-8b1e-4f07-b3c9-36bc4781a386","Type":"ContainerStarted","Data":"545d0a4dfbecffb0f9c29e4cf099e382302c9c5682c08fe8a7fbe8e6c0247a35"} Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.384160 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 
05:06:48.391373 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jhdxv"] Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.399447 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kbtmv" podStartSLOduration=3.399434588 podStartE2EDuration="3.399434588s" podCreationTimestamp="2026-03-21 05:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:48.396351971 +0000 UTC m=+1161.372815605" watchObservedRunningTime="2026-03-21 05:06:48.399434588 +0000 UTC m=+1161.375898212" Mar 21 05:06:48 crc kubenswrapper[4775]: I0321 05:06:48.480361 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"] Mar 21 05:06:49 crc kubenswrapper[4775]: I0321 05:06:49.674661 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" path="/var/lib/kubelet/pods/e2c4bdcc-f0d3-452d-87ac-52ce486f1c52/volumes" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.404268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c4c5b5499-jnmhl" event={"ID":"e7fbefc1-c1f6-472d-82b4-77c0a44b514f","Type":"ContainerStarted","Data":"4d928750acf27a12d43a71655b2e716129e9bfb358dbdddb74f0ae57f062cbba"} Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.405296 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.406080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" event={"ID":"ed132640-db0b-458b-ba3f-d5aedb284447","Type":"ContainerDied","Data":"80631c4a829e6fbdb4bacbdb17279550e80269d3798a6721d9431adbccaec3b9"} Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.406147 4775 scope.go:117] "RemoveContainer" containerID="0b3ac32d8cf0aa92220dc7dc36f6c6433239ed9b4e495f670922d65ddd3a5492" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.519501 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.519621 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.519679 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m66r\" (UniqueName: \"kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.519731 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: 
I0321 05:06:50.519790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.519832 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb\") pod \"ed132640-db0b-458b-ba3f-d5aedb284447\" (UID: \"ed132640-db0b-458b-ba3f-d5aedb284447\") " Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.526975 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r" (OuterVolumeSpecName: "kube-api-access-7m66r") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "kube-api-access-7m66r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.546849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.550562 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.553637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.554576 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.561306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config" (OuterVolumeSpecName: "config") pod "ed132640-db0b-458b-ba3f-d5aedb284447" (UID: "ed132640-db0b-458b-ba3f-d5aedb284447"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621655 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621699 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m66r\" (UniqueName: \"kubernetes.io/projected/ed132640-db0b-458b-ba3f-d5aedb284447-kube-api-access-7m66r\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621714 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621729 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621744 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:50 crc kubenswrapper[4775]: I0321 05:06:50.621756 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed132640-db0b-458b-ba3f-d5aedb284447-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.456231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-td76j" event={"ID":"716605f1-5111-4e7a-9591-18dfb5da1984","Type":"ContainerStarted","Data":"43def41a0cf1f318d43db6319e03e84d1a862d118a43d560e7c5e1d51f26dead"} Mar 21 05:06:51 crc 
kubenswrapper[4775]: I0321 05:06:51.462337 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-t7qd7" Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.465171 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" event={"ID":"aaad9349-8b1e-4f07-b3c9-36bc4781a386","Type":"ContainerStarted","Data":"f2fafd7848327783c5d640051878058b230f83637eec544e85f17df14ee71f99"} Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.465757 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.476754 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-td76j" podStartSLOduration=2.708029554 podStartE2EDuration="39.476733655s" podCreationTimestamp="2026-03-21 05:06:12 +0000 UTC" firstStartedPulling="2026-03-21 05:06:13.503500319 +0000 UTC m=+1126.479963943" lastFinishedPulling="2026-03-21 05:06:50.27220442 +0000 UTC m=+1163.248668044" observedRunningTime="2026-03-21 05:06:51.472502625 +0000 UTC m=+1164.448966249" watchObservedRunningTime="2026-03-21 05:06:51.476733655 +0000 UTC m=+1164.453197279" Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.504480 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" podStartSLOduration=5.504459889 podStartE2EDuration="5.504459889s" podCreationTimestamp="2026-03-21 05:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:06:51.491458161 +0000 UTC m=+1164.467921795" watchObservedRunningTime="2026-03-21 05:06:51.504459889 +0000 UTC m=+1164.480923513" Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.561587 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"] Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.575830 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-t7qd7"] Mar 21 05:06:51 crc kubenswrapper[4775]: I0321 05:06:51.693267 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed132640-db0b-458b-ba3f-d5aedb284447" path="/var/lib/kubelet/pods/ed132640-db0b-458b-ba3f-d5aedb284447/volumes" Mar 21 05:06:53 crc kubenswrapper[4775]: I0321 05:06:53.483921 4775 generic.go:334] "Generic (PLEG): container finished" podID="50f0ec15-decb-4879-9410-0962a21f83ae" containerID="fdcd202238496f57430b4ad2beaac05c4b222263df155d2d3bab8e200bbd88c5" exitCode=0 Mar 21 05:06:53 crc kubenswrapper[4775]: I0321 05:06:53.483971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbtmv" event={"ID":"50f0ec15-decb-4879-9410-0962a21f83ae","Type":"ContainerDied","Data":"fdcd202238496f57430b4ad2beaac05c4b222263df155d2d3bab8e200bbd88c5"} Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.572960 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9fd475b9c-m48br"] Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.634085 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:06:54 crc kubenswrapper[4775]: E0321 05:06:54.634880 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed132640-db0b-458b-ba3f-d5aedb284447" containerName="init" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.634958 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed132640-db0b-458b-ba3f-d5aedb284447" containerName="init" Mar 21 05:06:54 crc kubenswrapper[4775]: E0321 05:06:54.635027 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="init" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.637474 4775 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="init" Mar 21 05:06:54 crc kubenswrapper[4775]: E0321 05:06:54.637645 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="dnsmasq-dns" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.637722 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="dnsmasq-dns" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.638044 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c4bdcc-f0d3-452d-87ac-52ce486f1c52" containerName="dnsmasq-dns" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.638109 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed132640-db0b-458b-ba3f-d5aedb284447" containerName="init" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.639419 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.643437 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.664313 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.679832 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"] Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.700344 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6496ddbdd4-v5mc5"] Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.701680 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.721914 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6496ddbdd4-v5mc5"] Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.731795 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmrs\" (UniqueName: \"kubernetes.io/projected/fc6e433f-9e70-4b09-9780-403634bbe0dc-kube-api-access-mdmrs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-secret-key\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732320 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6e433f-9e70-4b09-9780-403634bbe0dc-logs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-config-data\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732616 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-combined-ca-bundle\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-scripts\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.732918 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.733004 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-tls-certs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.733162 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.733272 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4gg\" (UniqueName: \"kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.733371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.833967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834041 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834074 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-config-data\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834096 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-combined-ca-bundle\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-scripts\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834189 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle\") pod 
\"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-tls-certs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834329 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4gg\" (UniqueName: \"kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834445 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmrs\" (UniqueName: \"kubernetes.io/projected/fc6e433f-9e70-4b09-9780-403634bbe0dc-kube-api-access-mdmrs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: 
\"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-secret-key\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6e433f-9e70-4b09-9780-403634bbe0dc-logs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.834708 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.835198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6e433f-9e70-4b09-9780-403634bbe0dc-logs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.836255 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-config-data\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.836947 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.837066 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6e433f-9e70-4b09-9780-403634bbe0dc-scripts\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.837297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.843072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-secret-key\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.844193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.849212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.853336 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-combined-ca-bundle\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.853751 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6e433f-9e70-4b09-9780-403634bbe0dc-horizon-tls-certs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.857423 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs\") pod \"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.857874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4gg\" (UniqueName: \"kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg\") pod 
\"horizon-64fb567758-hd2ld\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.861972 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmrs\" (UniqueName: \"kubernetes.io/projected/fc6e433f-9e70-4b09-9780-403634bbe0dc-kube-api-access-mdmrs\") pod \"horizon-6496ddbdd4-v5mc5\" (UID: \"fc6e433f-9e70-4b09-9780-403634bbe0dc\") " pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:54 crc kubenswrapper[4775]: I0321 05:06:54.978646 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:06:55 crc kubenswrapper[4775]: I0321 05:06:55.031531 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:06:56 crc kubenswrapper[4775]: I0321 05:06:56.708202 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" Mar 21 05:06:56 crc kubenswrapper[4775]: I0321 05:06:56.767552 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"] Mar 21 05:06:56 crc kubenswrapper[4775]: I0321 05:06:56.768193 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" containerID="cri-o://6c721bd1cf19d6ad1fc2eb055e74ccf9afea13852ad17eeb02e87ea598c42465" gracePeriod=10 Mar 21 05:06:58 crc kubenswrapper[4775]: I0321 05:06:58.544644 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerID="6c721bd1cf19d6ad1fc2eb055e74ccf9afea13852ad17eeb02e87ea598c42465" exitCode=0 Mar 21 05:06:58 crc kubenswrapper[4775]: I0321 05:06:58.544759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" 
event={"ID":"6b50afbd-31b8-40ff-bd7b-1ce5021e2837","Type":"ContainerDied","Data":"6c721bd1cf19d6ad1fc2eb055e74ccf9afea13852ad17eeb02e87ea598c42465"} Mar 21 05:07:01 crc kubenswrapper[4775]: I0321 05:07:01.095021 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:07:02 crc kubenswrapper[4775]: I0321 05:07:02.482870 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:07:02 crc kubenswrapper[4775]: I0321 05:07:02.482946 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:07:06 crc kubenswrapper[4775]: I0321 05:07:06.095513 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 05:07:06 crc kubenswrapper[4775]: E0321 05:07:06.719655 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:07:06 crc kubenswrapper[4775]: E0321 05:07:06.720313 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5h5bbh9bh666h594h66fhcch68bh5dch9bh649hcch5fh9dh676h79h698h79h687h8hf7h74h5dh57h6ch655h5b4h86h554h664hdch647q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jjzgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-9fd475b9c-m48br_openstack(f281a5f0-270c-4d95-93b1-ad265786b886): ErrImagePull: rpc error: 
code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:06 crc kubenswrapper[4775]: E0321 05:07:06.722983 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-9fd475b9c-m48br" podUID="f281a5f0-270c-4d95-93b1-ad265786b886" Mar 21 05:07:07 crc kubenswrapper[4775]: E0321 05:07:07.024156 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 21 05:07:07 crc kubenswrapper[4775]: E0321 05:07:07.024353 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b5h568h666hcfh5ch64bh67ch75h96h645h697h66fh58fh56fh548hf9h554h5c8h669h588h75hbbh64dh596h689h585hb5h68dh87h579h69h5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8h8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8ac5c16c-56ac-4299-ae61-a8200986ce10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:16 crc kubenswrapper[4775]: I0321 05:07:16.094846 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 05:07:16 crc kubenswrapper[4775]: I0321 05:07:16.095561 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.095821 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.791663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kbtmv" 
event={"ID":"50f0ec15-decb-4879-9410-0962a21f83ae","Type":"ContainerDied","Data":"241678b559ba9174070a5793922c2b069fb839b133fbc00760a37f9bf9777ed9"} Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.791927 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="241678b559ba9174070a5793922c2b069fb839b133fbc00760a37f9bf9777ed9" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.803427 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.887855 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.887929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.888012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.888047 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 
05:07:21.888108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.888157 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znghz\" (UniqueName: \"kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz\") pod \"50f0ec15-decb-4879-9410-0962a21f83ae\" (UID: \"50f0ec15-decb-4879-9410-0962a21f83ae\") " Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.894539 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.894788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts" (OuterVolumeSpecName: "scripts") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.895450 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz" (OuterVolumeSpecName: "kube-api-access-znghz") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "kube-api-access-znghz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.898302 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.918789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.936850 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data" (OuterVolumeSpecName: "config-data") pod "50f0ec15-decb-4879-9410-0962a21f83ae" (UID: "50f0ec15-decb-4879-9410-0962a21f83ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989834 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znghz\" (UniqueName: \"kubernetes.io/projected/50f0ec15-decb-4879-9410-0962a21f83ae-kube-api-access-znghz\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989865 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989875 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989883 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989892 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:21 crc kubenswrapper[4775]: I0321 05:07:21.989900 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50f0ec15-decb-4879-9410-0962a21f83ae-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.800194 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kbtmv" Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.898130 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kbtmv"] Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.906312 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kbtmv"] Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.994614 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jxmtf"] Mar 21 05:07:22 crc kubenswrapper[4775]: E0321 05:07:22.994974 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f0ec15-decb-4879-9410-0962a21f83ae" containerName="keystone-bootstrap" Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.994986 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f0ec15-decb-4879-9410-0962a21f83ae" containerName="keystone-bootstrap" Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.995952 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f0ec15-decb-4879-9410-0962a21f83ae" containerName="keystone-bootstrap" Mar 21 05:07:22 crc kubenswrapper[4775]: I0321 05:07:22.996563 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.001650 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.001862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.002002 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nhwlw" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.003443 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.007507 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.013569 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jxmtf"] Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108372 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx7pf\" (UniqueName: \"kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108471 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108503 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108550 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108567 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.108801 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.210984 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.211059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.211664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.211735 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx7pf\" (UniqueName: \"kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.211808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.211925 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.216415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys\") pod 
\"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.216916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.217046 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.219794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.221003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.228255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx7pf\" (UniqueName: \"kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf\") pod \"keystone-bootstrap-jxmtf\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") " pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc 
kubenswrapper[4775]: I0321 05:07:23.318167 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jxmtf" Mar 21 05:07:23 crc kubenswrapper[4775]: I0321 05:07:23.673606 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f0ec15-decb-4879-9410-0962a21f83ae" path="/var/lib/kubelet/pods/50f0ec15-decb-4879-9410-0962a21f83ae/volumes" Mar 21 05:07:23 crc kubenswrapper[4775]: E0321 05:07:23.771483 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 21 05:07:23 crc kubenswrapper[4775]: E0321 05:07:23.771641 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/t
ls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgm22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-dh4pj_openstack(5cd9d81f-8c8a-46f9-9943-42dc0b638bef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:23 crc kubenswrapper[4775]: E0321 05:07:23.773035 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-dh4pj" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" Mar 21 05:07:23 crc kubenswrapper[4775]: E0321 05:07:23.809313 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-dh4pj" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.865680 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.866146 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf9h5c5h5d9h99h559h585h84h5fch7fh584h589h64h5fch695h5f5h65fh5h88h564h645h547hcfhb5hfdh55fh596h8ch649hd4h577h65fh599q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbdnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSo
urce{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6c4c5b5499-jnmhl_openstack(e7fbefc1-c1f6-472d-82b4-77c0a44b514f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.870168 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6c4c5b5499-jnmhl" podUID="e7fbefc1-c1f6-472d-82b4-77c0a44b514f" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.870815 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-barbican-api/blobs/sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa\": context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.871029 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9bczb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-9cmq2_openstack(1658991f-7a2b-4ce9-a240-a940385e0b8f): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-barbican-api/blobs/sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa\": context canceled" logger="UnhandledError" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.873022 4775 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-barbican-api/blobs/sha256:5ee16af83557284a12c5915ee8d5d1cc2bc45e245e379412416cadf7824031aa\\\": context canceled\"" pod="openstack/barbican-db-sync-9cmq2" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.884574 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.884755 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb4h76h645hffh57dh684hc8h59chdh657h565hdhb9h5ch75h575h58h686h67h68bh679h65fh565h8bh5dbh9h5f8h596h555hd9h5b8h694q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tstsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7bc489768c-rlbc8_openstack(44b58fbc-be53-4531-9dee-05ba398f25ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 
05:07:24.887025 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7bc489768c-rlbc8" podUID="44b58fbc-be53-4531-9dee-05ba398f25ef" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.926565 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.926722 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-764f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-l8qqc_openstack(37bb7e34-ac47-44f6-b18f-ef4ed78eea6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:07:24 crc kubenswrapper[4775]: E0321 05:07:24.928182 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-l8qqc" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.041379 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.042273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb\") pod \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.042310 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc\") pod \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.051918 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.141606 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b50afbd-31b8-40ff-bd7b-1ce5021e2837" (UID: "6b50afbd-31b8-40ff-bd7b-1ce5021e2837"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.143709 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltd4s\" (UniqueName: \"kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s\") pod \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.143754 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb\") pod \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.143785 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config\") pod \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\" (UID: \"6b50afbd-31b8-40ff-bd7b-1ce5021e2837\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.144588 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.145407 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b50afbd-31b8-40ff-bd7b-1ce5021e2837" (UID: "6b50afbd-31b8-40ff-bd7b-1ce5021e2837"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.159901 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s" (OuterVolumeSpecName: "kube-api-access-ltd4s") pod "6b50afbd-31b8-40ff-bd7b-1ce5021e2837" (UID: "6b50afbd-31b8-40ff-bd7b-1ce5021e2837"). InnerVolumeSpecName "kube-api-access-ltd4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.198846 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config" (OuterVolumeSpecName: "config") pod "6b50afbd-31b8-40ff-bd7b-1ce5021e2837" (UID: "6b50afbd-31b8-40ff-bd7b-1ce5021e2837"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.202631 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b50afbd-31b8-40ff-bd7b-1ce5021e2837" (UID: "6b50afbd-31b8-40ff-bd7b-1ce5021e2837"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.245929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs\") pod \"f281a5f0-270c-4d95-93b1-ad265786b886\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.245998 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts\") pod \"f281a5f0-270c-4d95-93b1-ad265786b886\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246093 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key\") pod \"f281a5f0-270c-4d95-93b1-ad265786b886\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjzgj\" (UniqueName: \"kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj\") pod \"f281a5f0-270c-4d95-93b1-ad265786b886\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246252 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data\") pod \"f281a5f0-270c-4d95-93b1-ad265786b886\" (UID: \"f281a5f0-270c-4d95-93b1-ad265786b886\") " Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246336 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs" (OuterVolumeSpecName: "logs") pod "f281a5f0-270c-4d95-93b1-ad265786b886" (UID: "f281a5f0-270c-4d95-93b1-ad265786b886"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246554 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts" (OuterVolumeSpecName: "scripts") pod "f281a5f0-270c-4d95-93b1-ad265786b886" (UID: "f281a5f0-270c-4d95-93b1-ad265786b886"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246902 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data" (OuterVolumeSpecName: "config-data") pod "f281a5f0-270c-4d95-93b1-ad265786b886" (UID: "f281a5f0-270c-4d95-93b1-ad265786b886"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.246993 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f281a5f0-270c-4d95-93b1-ad265786b886-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.247018 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.247033 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltd4s\" (UniqueName: \"kubernetes.io/projected/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-kube-api-access-ltd4s\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.247051 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.247065 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.247078 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b50afbd-31b8-40ff-bd7b-1ce5021e2837-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.249523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f281a5f0-270c-4d95-93b1-ad265786b886" (UID: "f281a5f0-270c-4d95-93b1-ad265786b886"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.250877 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj" (OuterVolumeSpecName: "kube-api-access-jjzgj") pod "f281a5f0-270c-4d95-93b1-ad265786b886" (UID: "f281a5f0-270c-4d95-93b1-ad265786b886"). InnerVolumeSpecName "kube-api-access-jjzgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.349344 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjzgj\" (UniqueName: \"kubernetes.io/projected/f281a5f0-270c-4d95-93b1-ad265786b886-kube-api-access-jjzgj\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.349382 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f281a5f0-270c-4d95-93b1-ad265786b886-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.349393 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f281a5f0-270c-4d95-93b1-ad265786b886-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.381456 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6496ddbdd4-v5mc5"] Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.762361 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:07:25 crc kubenswrapper[4775]: W0321 05:07:25.763181 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09cf3763_0b41_4452_a247_d9a56f58b05d.slice/crio-7045bd948f55810ab12433655a5f2808bafab5e3e89d9d68cb00186c737c1405 WatchSource:0}: Error finding container 7045bd948f55810ab12433655a5f2808bafab5e3e89d9d68cb00186c737c1405: Status 404 returned error can't find the container with id 7045bd948f55810ab12433655a5f2808bafab5e3e89d9d68cb00186c737c1405 Mar 21 05:07:25 crc kubenswrapper[4775]: W0321 05:07:25.820535 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf8b79b_6156_4aa5_a769_3a96408745c1.slice/crio-468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133 WatchSource:0}: Error finding container 468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133: Status 404 returned error can't find the container with id 468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133 Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.824673 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jxmtf"] Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.830606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" event={"ID":"6b50afbd-31b8-40ff-bd7b-1ce5021e2837","Type":"ContainerDied","Data":"2e772d1abd813f0f52a871bbd2aa2afc7b674d5369d7a3240dceb79e9ea9149f"} Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.830672 4775 scope.go:117] "RemoveContainer" containerID="6c721bd1cf19d6ad1fc2eb055e74ccf9afea13852ad17eeb02e87ea598c42465" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.830704 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.836738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerStarted","Data":"4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90"} Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.841262 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerStarted","Data":"7045bd948f55810ab12433655a5f2808bafab5e3e89d9d68cb00186c737c1405"} Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.842366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6496ddbdd4-v5mc5" event={"ID":"fc6e433f-9e70-4b09-9780-403634bbe0dc","Type":"ContainerStarted","Data":"def71a90dc0cab6d68037e66af17622e205fe9b15e3446e1e10bd30605f49207"} Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.852717 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9fd475b9c-m48br" event={"ID":"f281a5f0-270c-4d95-93b1-ad265786b886","Type":"ContainerDied","Data":"3ed2947533338d4f3898375bd2e8d20bf0cdf56e90ab25944e2ce0199f079b4a"} Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.852898 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9fd475b9c-m48br" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.862959 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"] Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.872163 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-b4m98"] Mar 21 05:07:25 crc kubenswrapper[4775]: E0321 05:07:25.929096 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-l8qqc" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" Mar 21 05:07:25 crc kubenswrapper[4775]: E0321 05:07:25.929417 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-9cmq2" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.929487 4775 scope.go:117] "RemoveContainer" containerID="5d8a4462c4906d47c3863ff44ffe3802ba03cf186cd035ba4999f12c54eb2a8b" Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.964224 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9fd475b9c-m48br"] Mar 21 05:07:25 crc kubenswrapper[4775]: I0321 05:07:25.973148 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9fd475b9c-m48br"] Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.096823 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-b4m98" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 05:07:26 
crc kubenswrapper[4775]: I0321 05:07:26.255646 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.264330 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbdnp\" (UniqueName: \"kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp\") pod \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.264778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs\") pod \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.264816 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts\") pod \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.264886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key\") pod \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.264943 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data\") pod \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\" (UID: \"e7fbefc1-c1f6-472d-82b4-77c0a44b514f\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.265199 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs" (OuterVolumeSpecName: "logs") pod "e7fbefc1-c1f6-472d-82b4-77c0a44b514f" (UID: "e7fbefc1-c1f6-472d-82b4-77c0a44b514f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.265735 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.266808 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data" (OuterVolumeSpecName: "config-data") pod "e7fbefc1-c1f6-472d-82b4-77c0a44b514f" (UID: "e7fbefc1-c1f6-472d-82b4-77c0a44b514f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.267489 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts" (OuterVolumeSpecName: "scripts") pod "e7fbefc1-c1f6-472d-82b4-77c0a44b514f" (UID: "e7fbefc1-c1f6-472d-82b4-77c0a44b514f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.270353 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp" (OuterVolumeSpecName: "kube-api-access-rbdnp") pod "e7fbefc1-c1f6-472d-82b4-77c0a44b514f" (UID: "e7fbefc1-c1f6-472d-82b4-77c0a44b514f"). InnerVolumeSpecName "kube-api-access-rbdnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.276467 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e7fbefc1-c1f6-472d-82b4-77c0a44b514f" (UID: "e7fbefc1-c1f6-472d-82b4-77c0a44b514f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.277529 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc489768c-rlbc8" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.368542 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbdnp\" (UniqueName: \"kubernetes.io/projected/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-kube-api-access-rbdnp\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.368569 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.368581 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.368591 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e7fbefc1-c1f6-472d-82b4-77c0a44b514f-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.470780 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tstsp\" (UniqueName: 
\"kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp\") pod \"44b58fbc-be53-4531-9dee-05ba398f25ef\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.470860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts\") pod \"44b58fbc-be53-4531-9dee-05ba398f25ef\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.470893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data\") pod \"44b58fbc-be53-4531-9dee-05ba398f25ef\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.470940 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key\") pod \"44b58fbc-be53-4531-9dee-05ba398f25ef\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.470979 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs\") pod \"44b58fbc-be53-4531-9dee-05ba398f25ef\" (UID: \"44b58fbc-be53-4531-9dee-05ba398f25ef\") " Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.471782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs" (OuterVolumeSpecName: "logs") pod "44b58fbc-be53-4531-9dee-05ba398f25ef" (UID: "44b58fbc-be53-4531-9dee-05ba398f25ef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.471868 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts" (OuterVolumeSpecName: "scripts") pod "44b58fbc-be53-4531-9dee-05ba398f25ef" (UID: "44b58fbc-be53-4531-9dee-05ba398f25ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.472474 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data" (OuterVolumeSpecName: "config-data") pod "44b58fbc-be53-4531-9dee-05ba398f25ef" (UID: "44b58fbc-be53-4531-9dee-05ba398f25ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.475157 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "44b58fbc-be53-4531-9dee-05ba398f25ef" (UID: "44b58fbc-be53-4531-9dee-05ba398f25ef"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.476251 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp" (OuterVolumeSpecName: "kube-api-access-tstsp") pod "44b58fbc-be53-4531-9dee-05ba398f25ef" (UID: "44b58fbc-be53-4531-9dee-05ba398f25ef"). InnerVolumeSpecName "kube-api-access-tstsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.573583 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tstsp\" (UniqueName: \"kubernetes.io/projected/44b58fbc-be53-4531-9dee-05ba398f25ef-kube-api-access-tstsp\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.573950 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.573966 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/44b58fbc-be53-4531-9dee-05ba398f25ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.573978 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/44b58fbc-be53-4531-9dee-05ba398f25ef-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.573988 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44b58fbc-be53-4531-9dee-05ba398f25ef-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.906035 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6496ddbdd4-v5mc5" event={"ID":"fc6e433f-9e70-4b09-9780-403634bbe0dc","Type":"ContainerStarted","Data":"8b8b8699c4dde7309a59e872dc53db96f0f40ebcc3885606c855e96185e66b67"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.906083 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6496ddbdd4-v5mc5" event={"ID":"fc6e433f-9e70-4b09-9780-403634bbe0dc","Type":"ContainerStarted","Data":"3f539614cc2828bc92642f654cb791c2699bf6b0605023f3c2de3117a0947231"} 
Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.910213 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c4c5b5499-jnmhl" event={"ID":"e7fbefc1-c1f6-472d-82b4-77c0a44b514f","Type":"ContainerDied","Data":"4d928750acf27a12d43a71655b2e716129e9bfb358dbdddb74f0ae57f062cbba"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.910216 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c4c5b5499-jnmhl" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.916465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bc489768c-rlbc8" event={"ID":"44b58fbc-be53-4531-9dee-05ba398f25ef","Type":"ContainerDied","Data":"e60f7afbe17b7aa1bbfaf8e491a43aeb53519de12ebcef7dcaf29708ec522108"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.916556 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bc489768c-rlbc8" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.924025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jxmtf" event={"ID":"cdf8b79b-6156-4aa5-a769-3a96408745c1","Type":"ContainerStarted","Data":"e72ab47d319ea50893e1c6b3a16586fa25056fa334f81da8dd2632328d94d338"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.924082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jxmtf" event={"ID":"cdf8b79b-6156-4aa5-a769-3a96408745c1","Type":"ContainerStarted","Data":"468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.931198 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6496ddbdd4-v5mc5" podStartSLOduration=32.386875967 podStartE2EDuration="32.931180539s" podCreationTimestamp="2026-03-21 05:06:54 +0000 UTC" firstStartedPulling="2026-03-21 05:07:25.412782158 +0000 UTC m=+1198.389245782" 
lastFinishedPulling="2026-03-21 05:07:25.95708673 +0000 UTC m=+1198.933550354" observedRunningTime="2026-03-21 05:07:26.929691587 +0000 UTC m=+1199.906155231" watchObservedRunningTime="2026-03-21 05:07:26.931180539 +0000 UTC m=+1199.907644163" Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.931860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerStarted","Data":"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.931912 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerStarted","Data":"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd"} Mar 21 05:07:26 crc kubenswrapper[4775]: I0321 05:07:26.966716 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jxmtf" podStartSLOduration=4.966690433 podStartE2EDuration="4.966690433s" podCreationTimestamp="2026-03-21 05:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:26.951143993 +0000 UTC m=+1199.927607627" watchObservedRunningTime="2026-03-21 05:07:26.966690433 +0000 UTC m=+1199.943154057" Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.005363 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"] Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.023850 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bc489768c-rlbc8"] Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.053902 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"] Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.061537 4775 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c4c5b5499-jnmhl"] Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.066767 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64fb567758-hd2ld" podStartSLOduration=32.615005353 podStartE2EDuration="33.06674658s" podCreationTimestamp="2026-03-21 05:06:54 +0000 UTC" firstStartedPulling="2026-03-21 05:07:25.765695511 +0000 UTC m=+1198.742159135" lastFinishedPulling="2026-03-21 05:07:26.217436738 +0000 UTC m=+1199.193900362" observedRunningTime="2026-03-21 05:07:27.033929313 +0000 UTC m=+1200.010392957" watchObservedRunningTime="2026-03-21 05:07:27.06674658 +0000 UTC m=+1200.043210204" Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.676934 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b58fbc-be53-4531-9dee-05ba398f25ef" path="/var/lib/kubelet/pods/44b58fbc-be53-4531-9dee-05ba398f25ef/volumes" Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.677671 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" path="/var/lib/kubelet/pods/6b50afbd-31b8-40ff-bd7b-1ce5021e2837/volumes" Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.678338 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fbefc1-c1f6-472d-82b4-77c0a44b514f" path="/var/lib/kubelet/pods/e7fbefc1-c1f6-472d-82b4-77c0a44b514f/volumes" Mar 21 05:07:27 crc kubenswrapper[4775]: I0321 05:07:27.678754 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f281a5f0-270c-4d95-93b1-ad265786b886" path="/var/lib/kubelet/pods/f281a5f0-270c-4d95-93b1-ad265786b886/volumes" Mar 21 05:07:31 crc kubenswrapper[4775]: I0321 05:07:31.984173 4775 generic.go:334] "Generic (PLEG): container finished" podID="716605f1-5111-4e7a-9591-18dfb5da1984" containerID="43def41a0cf1f318d43db6319e03e84d1a862d118a43d560e7c5e1d51f26dead" exitCode=0 Mar 21 05:07:31 crc kubenswrapper[4775]: I0321 
05:07:31.984300 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-td76j" event={"ID":"716605f1-5111-4e7a-9591-18dfb5da1984","Type":"ContainerDied","Data":"43def41a0cf1f318d43db6319e03e84d1a862d118a43d560e7c5e1d51f26dead"} Mar 21 05:07:32 crc kubenswrapper[4775]: I0321 05:07:32.482768 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:07:32 crc kubenswrapper[4775]: I0321 05:07:32.482826 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:07:32 crc kubenswrapper[4775]: I0321 05:07:32.995482 4775 generic.go:334] "Generic (PLEG): container finished" podID="cdf8b79b-6156-4aa5-a769-3a96408745c1" containerID="e72ab47d319ea50893e1c6b3a16586fa25056fa334f81da8dd2632328d94d338" exitCode=0 Mar 21 05:07:32 crc kubenswrapper[4775]: I0321 05:07:32.995557 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jxmtf" event={"ID":"cdf8b79b-6156-4aa5-a769-3a96408745c1","Type":"ContainerDied","Data":"e72ab47d319ea50893e1c6b3a16586fa25056fa334f81da8dd2632328d94d338"} Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.701187 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-td76j"
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.817429 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data\") pod \"716605f1-5111-4e7a-9591-18dfb5da1984\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") "
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.817592 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle\") pod \"716605f1-5111-4e7a-9591-18dfb5da1984\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") "
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.817647 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data\") pod \"716605f1-5111-4e7a-9591-18dfb5da1984\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") "
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.817723 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl\") pod \"716605f1-5111-4e7a-9591-18dfb5da1984\" (UID: \"716605f1-5111-4e7a-9591-18dfb5da1984\") "
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.823256 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "716605f1-5111-4e7a-9591-18dfb5da1984" (UID: "716605f1-5111-4e7a-9591-18dfb5da1984"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.823586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl" (OuterVolumeSpecName: "kube-api-access-zmggl") pod "716605f1-5111-4e7a-9591-18dfb5da1984" (UID: "716605f1-5111-4e7a-9591-18dfb5da1984"). InnerVolumeSpecName "kube-api-access-zmggl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.840537 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "716605f1-5111-4e7a-9591-18dfb5da1984" (UID: "716605f1-5111-4e7a-9591-18dfb5da1984"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.857338 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data" (OuterVolumeSpecName: "config-data") pod "716605f1-5111-4e7a-9591-18dfb5da1984" (UID: "716605f1-5111-4e7a-9591-18dfb5da1984"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.920300 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.920332 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.920344 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716605f1-5111-4e7a-9591-18dfb5da1984-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:33 crc kubenswrapper[4775]: I0321 05:07:33.920352 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmggl\" (UniqueName: \"kubernetes.io/projected/716605f1-5111-4e7a-9591-18dfb5da1984-kube-api-access-zmggl\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.022829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerStarted","Data":"3ffce781b6baa0a2f3ae718a0cd398633d05ff48719dba863adc5152b47fe849"}
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.025055 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-td76j"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.025190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-td76j" event={"ID":"716605f1-5111-4e7a-9591-18dfb5da1984","Type":"ContainerDied","Data":"c9c92c1914b54290237327698e3251a34ba7a8a08da38fd6bd3a1901ce249b25"}
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.027404 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c92c1914b54290237327698e3251a34ba7a8a08da38fd6bd3a1901ce249b25"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380356 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"]
Mar 21 05:07:34 crc kubenswrapper[4775]: E0321 05:07:34.380683 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="init"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380697 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="init"
Mar 21 05:07:34 crc kubenswrapper[4775]: E0321 05:07:34.380718 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" containerName="glance-db-sync"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380724 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" containerName="glance-db-sync"
Mar 21 05:07:34 crc kubenswrapper[4775]: E0321 05:07:34.380737 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380743 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380899 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" containerName="glance-db-sync"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.380911 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b50afbd-31b8-40ff-bd7b-1ce5021e2837" containerName="dnsmasq-dns"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.381851 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.405383 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"]
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.490278 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jxmtf"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531061 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531143 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.531550 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vtjl\" (UniqueName: \"kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.632740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.632798 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx7pf\" (UniqueName: \"kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.632834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.632860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.632944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633060 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys\") pod \"cdf8b79b-6156-4aa5-a769-3a96408745c1\" (UID: \"cdf8b79b-6156-4aa5-a769-3a96408745c1\") "
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633251 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vtjl\" (UniqueName: \"kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.633443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.634605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.634613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.635579 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.635801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.635988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.641313 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts" (OuterVolumeSpecName: "scripts") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.641370 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.644347 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf" (OuterVolumeSpecName: "kube-api-access-dx7pf") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "kube-api-access-dx7pf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.655224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.660840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vtjl\" (UniqueName: \"kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl\") pod \"dnsmasq-dns-56df8fb6b7-bvxtx\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") " pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.686174 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.712672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data" (OuterVolumeSpecName: "config-data") pod "cdf8b79b-6156-4aa5-a769-3a96408745c1" (UID: "cdf8b79b-6156-4aa5-a769-3a96408745c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734651 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734682 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734692 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734702 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx7pf\" (UniqueName: \"kubernetes.io/projected/cdf8b79b-6156-4aa5-a769-3a96408745c1-kube-api-access-dx7pf\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734711 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.734725 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf8b79b-6156-4aa5-a769-3a96408745c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.775419 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.979198 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64fb567758-hd2ld"
Mar 21 05:07:34 crc kubenswrapper[4775]: I0321 05:07:34.979494 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64fb567758-hd2ld"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.032397 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6496ddbdd4-v5mc5"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.032444 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6496ddbdd4-v5mc5"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.033875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jxmtf" event={"ID":"cdf8b79b-6156-4aa5-a769-3a96408745c1","Type":"ContainerDied","Data":"468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133"}
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.033906 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468779f2f3697fbd3ede621a392a75e5d47b2fbe74df31f239ef36748cd50133"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.033937 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jxmtf"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.199405 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-66c879cfdd-smnxp"]
Mar 21 05:07:35 crc kubenswrapper[4775]: E0321 05:07:35.199832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf8b79b-6156-4aa5-a769-3a96408745c1" containerName="keystone-bootstrap"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.199855 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf8b79b-6156-4aa5-a769-3a96408745c1" containerName="keystone-bootstrap"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.200094 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf8b79b-6156-4aa5-a769-3a96408745c1" containerName="keystone-bootstrap"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.200776 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.203324 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.204015 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.204033 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.204054 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nhwlw"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.204249 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.204554 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.219330 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c879cfdd-smnxp"]
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.230143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"]
Mar 21 05:07:35 crc kubenswrapper[4775]: W0321 05:07:35.237991 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf65ac8a3_4328_41af_8854_e8a2a9fce295.slice/crio-b2b7a3537eba6e1449d61c2042541c00aac4345e771c671fe8c458e056f81a53 WatchSource:0}: Error finding container b2b7a3537eba6e1449d61c2042541c00aac4345e771c671fe8c458e056f81a53: Status 404 returned error can't find the container with id b2b7a3537eba6e1449d61c2042541c00aac4345e771c671fe8c458e056f81a53
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.279074 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.280843 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.286600 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.286790 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4x9cq"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.287057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.288558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-credential-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348560 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-scripts\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348596 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-public-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-combined-ca-bundle\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-config-data\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348719 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-fernet-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr9h\" (UniqueName: \"kubernetes.io/projected/9f965feb-5d82-4176-a14d-08a84c4ae794-kube-api-access-zjr9h\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.348777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-internal-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450619 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-public-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450833 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96cd\" (UniqueName: \"kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-combined-ca-bundle\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.450994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-config-data\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-fernet-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr9h\" (UniqueName: \"kubernetes.io/projected/9f965feb-5d82-4176-a14d-08a84c4ae794-kube-api-access-zjr9h\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451111 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-internal-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-credential-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.451520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-scripts\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.462389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-config-data\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.462718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-scripts\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.462809 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-internal-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.463313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-public-tls-certs\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.463314 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-credential-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.468291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-fernet-keys\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.468618 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f965feb-5d82-4176-a14d-08a84c4ae794-combined-ca-bundle\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.477553 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr9h\" (UniqueName: \"kubernetes.io/projected/9f965feb-5d82-4176-a14d-08a84c4ae794-kube-api-access-zjr9h\") pod \"keystone-66c879cfdd-smnxp\" (UID: \"9f965feb-5d82-4176-a14d-08a84c4ae794\") " pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.515013 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553324 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName:
\"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553392 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553496 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96cd\" (UniqueName: \"kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553578 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.553882 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.555067 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.559390 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.560713 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.578785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96cd\" (UniqueName: \"kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd\") pod \"glance-default-external-api-0\" (UID: 
\"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.587013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.592040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.637647 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.640976 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.649660 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.649855 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757181 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757528 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8zc\" (UniqueName: \"kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " 
pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757715 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.757776 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859388 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0" 
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859451 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8zc\" (UniqueName: \"kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.859599 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.860290 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.860296 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.861034 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.876271 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.876950 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.877078 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.884859 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8zc\" (UniqueName: \"kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.903459 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.908028 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c879cfdd-smnxp"]
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.949217 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:35 crc kubenswrapper[4775]: I0321 05:07:35.989134 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.053947 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh4pj" event={"ID":"5cd9d81f-8c8a-46f9-9943-42dc0b638bef","Type":"ContainerStarted","Data":"1a60b462497324b6ca2e30820dbc175100c3b70b7e0a4ec6225ceb18cce50eaa"}
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.059544 4775 generic.go:334] "Generic (PLEG): container finished" podID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerID="73687f66a6e199eea3343a248d8eb3f006e4d6f0bdcb64641db9e3b7a2d85758" exitCode=0
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.059632 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" event={"ID":"f65ac8a3-4328-41af-8854-e8a2a9fce295","Type":"ContainerDied","Data":"73687f66a6e199eea3343a248d8eb3f006e4d6f0bdcb64641db9e3b7a2d85758"}
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.059681 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" event={"ID":"f65ac8a3-4328-41af-8854-e8a2a9fce295","Type":"ContainerStarted","Data":"b2b7a3537eba6e1449d61c2042541c00aac4345e771c671fe8c458e056f81a53"}
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.066072 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c879cfdd-smnxp" event={"ID":"9f965feb-5d82-4176-a14d-08a84c4ae794","Type":"ContainerStarted","Data":"76ea5f3d3bd164fcd0cd09e31995b32bf21fe42671038c6ea3df00f04fb44076"}
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.138817 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dh4pj" podStartSLOduration=2.500218996 podStartE2EDuration="50.138799661s" podCreationTimestamp="2026-03-21 05:06:46 +0000 UTC" firstStartedPulling="2026-03-21 05:06:47.448641689 +0000 UTC m=+1160.425105313" lastFinishedPulling="2026-03-21 05:07:35.087222354 +0000 UTC m=+1208.063685978" observedRunningTime="2026-03-21 05:07:36.080612997 +0000 UTC m=+1209.057076621" watchObservedRunningTime="2026-03-21 05:07:36.138799661 +0000 UTC m=+1209.115263285"
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.508004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:07:36 crc kubenswrapper[4775]: I0321 05:07:36.598368 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:07:37 crc kubenswrapper[4775]: W0321 05:07:37.020509 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3627b61d_e88a_4523_8e6d_8a45a7e626c1.slice/crio-7327479ccc08ab30546cabbf16e6c59549cedbb1e41f1a39cfe62e456a6c97e2 WatchSource:0}: Error finding container 7327479ccc08ab30546cabbf16e6c59549cedbb1e41f1a39cfe62e456a6c97e2: Status 404 returned error can't find the container with id 7327479ccc08ab30546cabbf16e6c59549cedbb1e41f1a39cfe62e456a6c97e2
Mar 21 05:07:37 crc kubenswrapper[4775]: W0321 05:07:37.023370 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c6e8e3_4fb9_472b_bbf1_2bdcd372bf34.slice/crio-84eb5f4cda9bda7b5808a7ebed1bb7d2eb4953a40585830ab493ee083b6e0bcf WatchSource:0}: Error finding container 84eb5f4cda9bda7b5808a7ebed1bb7d2eb4953a40585830ab493ee083b6e0bcf: Status 404 returned error can't find the container with id 84eb5f4cda9bda7b5808a7ebed1bb7d2eb4953a40585830ab493ee083b6e0bcf
Mar 21 05:07:37 crc kubenswrapper[4775]: I0321 05:07:37.076235 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerStarted","Data":"7327479ccc08ab30546cabbf16e6c59549cedbb1e41f1a39cfe62e456a6c97e2"}
Mar 21 05:07:37 crc kubenswrapper[4775]: I0321 05:07:37.076982 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerStarted","Data":"84eb5f4cda9bda7b5808a7ebed1bb7d2eb4953a40585830ab493ee083b6e0bcf"}
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.017719 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.103109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerStarted","Data":"b0cd245fc2b0c70eaf76803ab7a80bbc622111a54e050582ef48868a191f3421"}
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.106482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c879cfdd-smnxp" event={"ID":"9f965feb-5d82-4176-a14d-08a84c4ae794","Type":"ContainerStarted","Data":"9e7c3bbccec13922e0bf4822134f7312831fdf48ec6372a3297f63367a30adc2"}
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.109104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.115652 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerStarted","Data":"32f21565b6fc51221eac38731ca38e1d9609cbf9ce961697af6d2d9ccdf3f74f"}
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.123907 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.138037 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-66c879cfdd-smnxp" podStartSLOduration=3.138017901 podStartE2EDuration="3.138017901s" podCreationTimestamp="2026-03-21 05:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:38.13129087 +0000 UTC m=+1211.107754504" watchObservedRunningTime="2026-03-21 05:07:38.138017901 +0000 UTC m=+1211.114481525"
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.155160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" event={"ID":"f65ac8a3-4328-41af-8854-e8a2a9fce295","Type":"ContainerStarted","Data":"18c26c2a6ec8c71c7881c713a6d2fb65eaa2258d47fd0f272a51e57881466532"}
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.155795 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:07:38 crc kubenswrapper[4775]: I0321 05:07:38.202079 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" podStartSLOduration=4.20204844 podStartE2EDuration="4.20204844s" podCreationTimestamp="2026-03-21 05:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:38.172423943 +0000 UTC m=+1211.148887577" watchObservedRunningTime="2026-03-21 05:07:38.20204844 +0000 UTC m=+1211.178512064"
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.165350 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerStarted","Data":"a8cd516d29617897d19857166b7b9b4d04fb227e81a99f9300ea48c6eed5c3c4"}
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.165677 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-log" containerID="cri-o://b0cd245fc2b0c70eaf76803ab7a80bbc622111a54e050582ef48868a191f3421" gracePeriod=30
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.165956 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-httpd" containerID="cri-o://a8cd516d29617897d19857166b7b9b4d04fb227e81a99f9300ea48c6eed5c3c4" gracePeriod=30
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.168785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerStarted","Data":"1d80db6f5c73575575e757ede386b167af7f87a0b631372385a77aa72299bb8b"}
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.168947 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-log" containerID="cri-o://32f21565b6fc51221eac38731ca38e1d9609cbf9ce961697af6d2d9ccdf3f74f" gracePeriod=30
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.169009 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-httpd" containerID="cri-o://1d80db6f5c73575575e757ede386b167af7f87a0b631372385a77aa72299bb8b" gracePeriod=30
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.192489 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.192474039 podStartE2EDuration="5.192474039s" podCreationTimestamp="2026-03-21 05:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:39.191037689 +0000 UTC m=+1212.167501313" watchObservedRunningTime="2026-03-21 05:07:39.192474039 +0000 UTC m=+1212.168937663"
Mar 21 05:07:39 crc kubenswrapper[4775]: I0321 05:07:39.218229 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.218207216 podStartE2EDuration="5.218207216s" podCreationTimestamp="2026-03-21 05:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:39.214315737 +0000 UTC m=+1212.190779371" watchObservedRunningTime="2026-03-21 05:07:39.218207216 +0000 UTC m=+1212.194670840"
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.199494 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l8qqc" event={"ID":"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a","Type":"ContainerStarted","Data":"84b76b926432d603dd6d10be15e6cfc9e20ac65787b9fb2a30c3bdbd38a96e09"}
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.205178 4775 generic.go:334] "Generic (PLEG): container finished" podID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerID="a8cd516d29617897d19857166b7b9b4d04fb227e81a99f9300ea48c6eed5c3c4" exitCode=0
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.205215 4775 generic.go:334] "Generic (PLEG): container finished" podID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerID="b0cd245fc2b0c70eaf76803ab7a80bbc622111a54e050582ef48868a191f3421" exitCode=143
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.205250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerDied","Data":"a8cd516d29617897d19857166b7b9b4d04fb227e81a99f9300ea48c6eed5c3c4"}
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.205293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerDied","Data":"b0cd245fc2b0c70eaf76803ab7a80bbc622111a54e050582ef48868a191f3421"}
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.208782 4775 generic.go:334] "Generic (PLEG): container finished" podID="9531f5f9-8f77-4882-b779-4210b1de81de" containerID="f19a3eb592d811463bd26fefd306c16b1ec6187913716e5585ce5b02489e2c0c" exitCode=0
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.208984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wvjlb" event={"ID":"9531f5f9-8f77-4882-b779-4210b1de81de","Type":"ContainerDied","Data":"f19a3eb592d811463bd26fefd306c16b1ec6187913716e5585ce5b02489e2c0c"}
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.217821 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-l8qqc" podStartSLOduration=4.012702732 podStartE2EDuration="55.217799636s" podCreationTimestamp="2026-03-21 05:06:45 +0000 UTC" firstStartedPulling="2026-03-21 05:06:47.033579901 +0000 UTC m=+1160.010043525" lastFinishedPulling="2026-03-21 05:07:38.238676805 +0000 UTC m=+1211.215140429" observedRunningTime="2026-03-21 05:07:40.216304664 +0000 UTC m=+1213.192768298" watchObservedRunningTime="2026-03-21 05:07:40.217799636 +0000 UTC m=+1213.194263260"
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.218723 4775 generic.go:334] "Generic (PLEG): container finished" podID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerID="1d80db6f5c73575575e757ede386b167af7f87a0b631372385a77aa72299bb8b" exitCode=0
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.218755 4775 generic.go:334] "Generic (PLEG): container finished" podID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerID="32f21565b6fc51221eac38731ca38e1d9609cbf9ce961697af6d2d9ccdf3f74f" exitCode=143
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.222575 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerDied","Data":"1d80db6f5c73575575e757ede386b167af7f87a0b631372385a77aa72299bb8b"}
Mar 21 05:07:40 crc kubenswrapper[4775]: I0321 05:07:40.222630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerDied","Data":"32f21565b6fc51221eac38731ca38e1d9609cbf9ce961697af6d2d9ccdf3f74f"}
Mar 21 05:07:41 crc kubenswrapper[4775]: I0321 05:07:41.233276 4775 generic.go:334] "Generic (PLEG): container finished" podID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" containerID="1a60b462497324b6ca2e30820dbc175100c3b70b7e0a4ec6225ceb18cce50eaa" exitCode=0
Mar 21 05:07:41 crc kubenswrapper[4775]: I0321 05:07:41.233661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh4pj" event={"ID":"5cd9d81f-8c8a-46f9-9943-42dc0b638bef","Type":"ContainerDied","Data":"1a60b462497324b6ca2e30820dbc175100c3b70b7e0a4ec6225ceb18cce50eaa"}
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.779997 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.895944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") "
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") "
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896027 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") "
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896256 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") "
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896335 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") "
Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896396 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96cd\" (UniqueName: 
\"kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.896437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\" (UID: \"3627b61d-e88a-4523-8e6d-8a45a7e626c1\") " Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.898162 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs" (OuterVolumeSpecName: "logs") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.898400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.906355 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts" (OuterVolumeSpecName: "scripts") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.913580 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd" (OuterVolumeSpecName: "kube-api-access-f96cd") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "kube-api-access-f96cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.918447 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.937224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.959284 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data" (OuterVolumeSpecName: "config-data") pod "3627b61d-e88a-4523-8e6d-8a45a7e626c1" (UID: "3627b61d-e88a-4523-8e6d-8a45a7e626c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998111 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998154 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998163 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96cd\" (UniqueName: \"kubernetes.io/projected/3627b61d-e88a-4523-8e6d-8a45a7e626c1-kube-api-access-f96cd\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998193 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998203 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3627b61d-e88a-4523-8e6d-8a45a7e626c1-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998211 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:42 crc kubenswrapper[4775]: I0321 05:07:42.998219 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3627b61d-e88a-4523-8e6d-8a45a7e626c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.032340 4775 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.099678 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.252015 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3627b61d-e88a-4523-8e6d-8a45a7e626c1","Type":"ContainerDied","Data":"7327479ccc08ab30546cabbf16e6c59549cedbb1e41f1a39cfe62e456a6c97e2"} Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.252079 4775 scope.go:117] "RemoveContainer" containerID="a8cd516d29617897d19857166b7b9b4d04fb227e81a99f9300ea48c6eed5c3c4" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.252237 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.291078 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.301661 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.325074 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:07:43 crc kubenswrapper[4775]: E0321 05:07:43.325471 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-httpd" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.325489 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-httpd" Mar 21 05:07:43 crc kubenswrapper[4775]: E0321 05:07:43.325518 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-log" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.325524 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-log" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.325691 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-httpd" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.325721 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" containerName="glance-log" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.326645 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.331392 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.332526 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.333235 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.407997 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4spw\" (UniqueName: \"kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408504 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.408565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510469 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510527 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510671 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d4spw\" (UniqueName: \"kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510743 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.510815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.511021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.511321 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.511500 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.517174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.517465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.517528 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.520209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.538958 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4spw\" (UniqueName: \"kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.542647 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.647043 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:07:43 crc kubenswrapper[4775]: I0321 05:07:43.673202 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3627b61d-e88a-4523-8e6d-8a45a7e626c1" path="/var/lib/kubelet/pods/3627b61d-e88a-4523-8e6d-8a45a7e626c1/volumes" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.513201 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629231 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd8zc\" (UniqueName: \"kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629432 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629564 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.629602 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs\") pod \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\" (UID: \"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34\") " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.630381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.630415 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs" (OuterVolumeSpecName: "logs") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.633918 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts" (OuterVolumeSpecName: "scripts") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.635071 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.635717 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc" (OuterVolumeSpecName: "kube-api-access-hd8zc") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "kube-api-access-hd8zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.667629 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.685829 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data" (OuterVolumeSpecName: "config-data") pod "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" (UID: "e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.732965 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733018 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd8zc\" (UniqueName: \"kubernetes.io/projected/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-kube-api-access-hd8zc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733033 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733063 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733075 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733087 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.733375 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.752846 4775 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.777783 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.836051 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.843915 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"] Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.844339 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="dnsmasq-dns" containerID="cri-o://f2fafd7848327783c5d640051878058b230f83637eec544e85f17df14ee71f99" gracePeriod=10 Mar 21 05:07:44 crc kubenswrapper[4775]: I0321 05:07:44.987445 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.035398 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6496ddbdd4-v5mc5" podUID="fc6e433f-9e70-4b09-9780-403634bbe0dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.273649 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34","Type":"ContainerDied","Data":"84eb5f4cda9bda7b5808a7ebed1bb7d2eb4953a40585830ab493ee083b6e0bcf"} Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.273665 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.277724 4775 generic.go:334] "Generic (PLEG): container finished" podID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerID="f2fafd7848327783c5d640051878058b230f83637eec544e85f17df14ee71f99" exitCode=0 Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.277764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" event={"ID":"aaad9349-8b1e-4f07-b3c9-36bc4781a386","Type":"ContainerDied","Data":"f2fafd7848327783c5d640051878058b230f83637eec544e85f17df14ee71f99"} Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.340677 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.354484 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.369835 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:45 crc kubenswrapper[4775]: E0321 05:07:45.370418 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-httpd" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.370439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-httpd" Mar 21 05:07:45 crc kubenswrapper[4775]: E0321 05:07:45.370454 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-log" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.370500 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-log" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.370738 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-httpd" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.370760 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" containerName="glance-log" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.371937 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.374932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.374981 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.378550 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447717 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447764 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447845 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447864 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2pk\" (UniqueName: \"kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.447971 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549742 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2pk\" (UniqueName: 
\"kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549903 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549928 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.549946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.550128 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.550876 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.553021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.554936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.555547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.555752 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.559732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.572978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2pk\" (UniqueName: \"kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.578354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.672776 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34" path="/var/lib/kubelet/pods/e1c6e8e3-4fb9-472b-bbf1-2bdcd372bf34/volumes" Mar 21 05:07:45 crc kubenswrapper[4775]: I0321 05:07:45.733011 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:07:46 crc kubenswrapper[4775]: I0321 05:07:46.287210 4775 generic.go:334] "Generic (PLEG): container finished" podID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" containerID="84b76b926432d603dd6d10be15e6cfc9e20ac65787b9fb2a30c3bdbd38a96e09" exitCode=0 Mar 21 05:07:46 crc kubenswrapper[4775]: I0321 05:07:46.287293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l8qqc" event={"ID":"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a","Type":"ContainerDied","Data":"84b76b926432d603dd6d10be15e6cfc9e20ac65787b9fb2a30c3bdbd38a96e09"} Mar 21 05:07:46 crc kubenswrapper[4775]: I0321 05:07:46.700618 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: connect: connection refused" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.604964 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.610664 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dh4pj" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688495 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts\") pod \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle\") pod \"9531f5f9-8f77-4882-b779-4210b1de81de\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs\") pod \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data\") pod \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688932 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config\") pod \"9531f5f9-8f77-4882-b779-4210b1de81de\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.688973 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf22x\" (UniqueName: 
\"kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x\") pod \"9531f5f9-8f77-4882-b779-4210b1de81de\" (UID: \"9531f5f9-8f77-4882-b779-4210b1de81de\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.689030 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle\") pod \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.689071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgm22\" (UniqueName: \"kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22\") pod \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\" (UID: \"5cd9d81f-8c8a-46f9-9943-42dc0b638bef\") " Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.689565 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs" (OuterVolumeSpecName: "logs") pod "5cd9d81f-8c8a-46f9-9943-42dc0b638bef" (UID: "5cd9d81f-8c8a-46f9-9943-42dc0b638bef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.689871 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.696193 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22" (OuterVolumeSpecName: "kube-api-access-sgm22") pod "5cd9d81f-8c8a-46f9-9943-42dc0b638bef" (UID: "5cd9d81f-8c8a-46f9-9943-42dc0b638bef"). InnerVolumeSpecName "kube-api-access-sgm22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.697099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x" (OuterVolumeSpecName: "kube-api-access-gf22x") pod "9531f5f9-8f77-4882-b779-4210b1de81de" (UID: "9531f5f9-8f77-4882-b779-4210b1de81de"). InnerVolumeSpecName "kube-api-access-gf22x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.698504 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts" (OuterVolumeSpecName: "scripts") pod "5cd9d81f-8c8a-46f9-9943-42dc0b638bef" (UID: "5cd9d81f-8c8a-46f9-9943-42dc0b638bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.715683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data" (OuterVolumeSpecName: "config-data") pod "5cd9d81f-8c8a-46f9-9943-42dc0b638bef" (UID: "5cd9d81f-8c8a-46f9-9943-42dc0b638bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.716553 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cd9d81f-8c8a-46f9-9943-42dc0b638bef" (UID: "5cd9d81f-8c8a-46f9-9943-42dc0b638bef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.718266 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config" (OuterVolumeSpecName: "config") pod "9531f5f9-8f77-4882-b779-4210b1de81de" (UID: "9531f5f9-8f77-4882-b779-4210b1de81de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.734091 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9531f5f9-8f77-4882-b779-4210b1de81de" (UID: "9531f5f9-8f77-4882-b779-4210b1de81de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792130 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792168 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792181 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf22x\" (UniqueName: \"kubernetes.io/projected/9531f5f9-8f77-4882-b779-4210b1de81de-kube-api-access-gf22x\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792195 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc 
kubenswrapper[4775]: I0321 05:07:47.792209 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgm22\" (UniqueName: \"kubernetes.io/projected/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-kube-api-access-sgm22\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792220 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cd9d81f-8c8a-46f9-9943-42dc0b638bef-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:47 crc kubenswrapper[4775]: I0321 05:07:47.792231 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9531f5f9-8f77-4882-b779-4210b1de81de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.305377 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wvjlb" event={"ID":"9531f5f9-8f77-4882-b779-4210b1de81de","Type":"ContainerDied","Data":"a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c"} Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.305754 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0dba2a88554c5f818ac14151546ad519e642afe620cd4a8580aa68fd7107a3c" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.305483 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wvjlb" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.307911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dh4pj" event={"ID":"5cd9d81f-8c8a-46f9-9943-42dc0b638bef","Type":"ContainerDied","Data":"f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c"} Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.307958 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f10dda8724855df58c36f76f7e138a9aa8f4ab4f82a8c7d9fc8bb963f6bad44c" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.308007 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dh4pj" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.620895 4775 scope.go:117] "RemoveContainer" containerID="b0cd245fc2b0c70eaf76803ab7a80bbc622111a54e050582ef48868a191f3421" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.827178 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58948d8bb4-rcw89"] Mar 21 05:07:48 crc kubenswrapper[4775]: E0321 05:07:48.827840 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9531f5f9-8f77-4882-b779-4210b1de81de" containerName="neutron-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.827853 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9531f5f9-8f77-4882-b779-4210b1de81de" containerName="neutron-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: E0321 05:07:48.827874 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" containerName="placement-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.827880 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" containerName="placement-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.828035 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9531f5f9-8f77-4882-b779-4210b1de81de" containerName="neutron-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.828062 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" containerName="placement-db-sync" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.828991 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.831360 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.831468 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.831576 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.831744 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8q2g8" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.831887 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.846873 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58948d8bb4-rcw89"] Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.859475 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.860789 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.913698 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914791 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-internal-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r94w\" (UniqueName: \"kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjz8j\" (UniqueName: \"kubernetes.io/projected/279dff90-9d39-418a-b5e7-00333a376d16-kube-api-access-sjz8j\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914877 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-scripts\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914903 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-config-data\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.914946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-combined-ca-bundle\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-public-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915073 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/279dff90-9d39-418a-b5e7-00333a376d16-logs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.915168 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:48 crc kubenswrapper[4775]: I0321 05:07:48.979828 4775 scope.go:117] "RemoveContainer" containerID="1d80db6f5c73575575e757ede386b167af7f87a0b631372385a77aa72299bb8b" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.017644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: 
\"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-internal-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r94w\" (UniqueName: \"kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjz8j\" (UniqueName: \"kubernetes.io/projected/279dff90-9d39-418a-b5e7-00333a376d16-kube-api-access-sjz8j\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-scripts\") pod \"placement-58948d8bb4-rcw89\" (UID: 
\"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.022869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.022896 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.030409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.028814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-config-data\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.030682 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.030957 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-combined-ca-bundle\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.031192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-public-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.032223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/279dff90-9d39-418a-b5e7-00333a376d16-logs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.032362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.032508 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.033730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/279dff90-9d39-418a-b5e7-00333a376d16-logs\") pod \"placement-58948d8bb4-rcw89\" (UID: 
\"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.036813 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.039170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.039585 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.043959 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.047174 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.054081 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pms25" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.054419 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.054435 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.054650 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.059025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r94w\" (UniqueName: \"kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w\") pod \"dnsmasq-dns-6b7b667979-lk5mk\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.059598 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.060306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-config-data\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.063749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjz8j\" (UniqueName: \"kubernetes.io/projected/279dff90-9d39-418a-b5e7-00333a376d16-kube-api-access-sjz8j\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.063834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-public-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: 
\"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.091697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-scripts\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.097694 4775 scope.go:117] "RemoveContainer" containerID="32f21565b6fc51221eac38731ca38e1d9609cbf9ce961697af6d2d9ccdf3f74f" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.098399 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-internal-tls-certs\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.100706 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279dff90-9d39-418a-b5e7-00333a376d16-combined-ca-bundle\") pod \"placement-58948d8bb4-rcw89\" (UID: \"279dff90-9d39-418a-b5e7-00333a376d16\") " pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136455 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-764f7\" (UniqueName: \"kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136529 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136565 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136597 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136756 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.136863 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data\") pod \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\" (UID: \"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.137105 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc 
kubenswrapper[4775]: I0321 05:07:49.137160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.137192 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.137310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.137359 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv44z\" (UniqueName: \"kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.137451 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.143806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7" (OuterVolumeSpecName: "kube-api-access-764f7") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "kube-api-access-764f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.146414 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts" (OuterVolumeSpecName: "scripts") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.146659 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.150030 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.153291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.166645 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:49 crc kubenswrapper[4775]: E0321 05:07:49.202858 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.216340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.239745 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.239852 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.239886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.239920 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lzx7f\" (UniqueName: \"kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.239985 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.240049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0\") pod \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\" (UID: \"aaad9349-8b1e-4f07-b3c9-36bc4781a386\") " Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.240755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.240891 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv44z\" (UniqueName: \"kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.240956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241003 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241041 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241240 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241258 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-764f7\" (UniqueName: \"kubernetes.io/projected/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-kube-api-access-764f7\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241271 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241282 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.241294 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.244311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data" (OuterVolumeSpecName: "config-data") pod "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" (UID: "37bb7e34-ac47-44f6-b18f-ef4ed78eea6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.247550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.250784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f" (OuterVolumeSpecName: "kube-api-access-lzx7f") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "kube-api-access-lzx7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.258694 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.258996 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.267748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv44z\" (UniqueName: \"kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.269432 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config\") pod \"neutron-68ccf5bf68-lf5dz\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.305734 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.309337 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config" (OuterVolumeSpecName: "config") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.316150 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.316555 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.327433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cmq2" event={"ID":"1658991f-7a2b-4ce9-a240-a940385e0b8f","Type":"ContainerStarted","Data":"796b087daa0094c51c318e24d71b74b2beb873149bcfc37bff7fc5e4bd3c8ed7"} Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.333038 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aaad9349-8b1e-4f07-b3c9-36bc4781a386" (UID: "aaad9349-8b1e-4f07-b3c9-36bc4781a386"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.333292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerStarted","Data":"507f523f41fa20fba933e438c92908136440aa69787cc4ade1ab3f5729729a60"} Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.333534 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.333493 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="sg-core" containerID="cri-o://3ffce781b6baa0a2f3ae718a0cd398633d05ff48719dba863adc5152b47fe849" gracePeriod=30 Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.333387 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="ceilometer-notification-agent" containerID="cri-o://4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90" gracePeriod=30 Mar 21 05:07:49 
crc kubenswrapper[4775]: I0321 05:07:49.333506 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="proxy-httpd" containerID="cri-o://507f523f41fa20fba933e438c92908136440aa69787cc4ade1ab3f5729729a60" gracePeriod=30 Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.339645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" event={"ID":"aaad9349-8b1e-4f07-b3c9-36bc4781a386","Type":"ContainerDied","Data":"545d0a4dfbecffb0f9c29e4cf099e382302c9c5682c08fe8a7fbe8e6c0247a35"} Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.339740 4775 scope.go:117] "RemoveContainer" containerID="f2fafd7848327783c5d640051878058b230f83637eec544e85f17df14ee71f99" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.339912 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-4nfhc" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.353918 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354499 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354561 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354615 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354667 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzx7f\" (UniqueName: \"kubernetes.io/projected/aaad9349-8b1e-4f07-b3c9-36bc4781a386-kube-api-access-lzx7f\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354736 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.354795 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaad9349-8b1e-4f07-b3c9-36bc4781a386-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.355008 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-l8qqc" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.355341 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-l8qqc" event={"ID":"37bb7e34-ac47-44f6-b18f-ef4ed78eea6a","Type":"ContainerDied","Data":"45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1"} Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.355418 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cbfcc3ba69b1d165e2a6acc84f20765ed5bc3bad1c098418f5af9089eb0bf1" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.359773 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9cmq2" podStartSLOduration=2.187875791 podStartE2EDuration="1m3.359752203s" podCreationTimestamp="2026-03-21 05:06:46 +0000 UTC" firstStartedPulling="2026-03-21 05:06:47.494573548 +0000 UTC m=+1160.471037182" lastFinishedPulling="2026-03-21 05:07:48.66644995 +0000 UTC m=+1221.642913594" observedRunningTime="2026-03-21 05:07:49.346064436 +0000 UTC m=+1222.322528060" watchObservedRunningTime="2026-03-21 05:07:49.359752203 +0000 UTC m=+1222.336215827" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.375226 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.385981 4775 scope.go:117] "RemoveContainer" containerID="4bd42a0432eaefd26033f0ffdb594981243afbef4a05205089435812a0be336b" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.434798 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.444698 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-4nfhc"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.485521 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.512759 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.625196 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:49 crc kubenswrapper[4775]: W0321 05:07:49.674977 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a4d8e0_d0d0_4b04_9f21_ddf0d94c0e8f.slice/crio-fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4 WatchSource:0}: Error finding container fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4: Status 404 returned error can't find the container with id fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4 Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.704593 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" path="/var/lib/kubelet/pods/aaad9349-8b1e-4f07-b3c9-36bc4781a386/volumes" Mar 21 05:07:49 crc kubenswrapper[4775]: I0321 05:07:49.710504 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58948d8bb4-rcw89"] Mar 21 05:07:49 crc kubenswrapper[4775]: W0321 05:07:49.719897 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279dff90_9d39_418a_b5e7_00333a376d16.slice/crio-7e253dae2491da166ef4e31ce862a09f50e642f143ea394133c2a881166deda6 WatchSource:0}: Error finding container 7e253dae2491da166ef4e31ce862a09f50e642f143ea394133c2a881166deda6: Status 404 returned error can't find the container with id 7e253dae2491da166ef4e31ce862a09f50e642f143ea394133c2a881166deda6 Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.153618 4775 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:07:50 crc kubenswrapper[4775]: W0321 05:07:50.154881 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dc5e1ad_635a_478c_8fac_ac1fcdb2bad4.slice/crio-785cf1f0611121bc184a4bc0f34377c346c490853673595b9763600cad5d4149 WatchSource:0}: Error finding container 785cf1f0611121bc184a4bc0f34377c346c490853673595b9763600cad5d4149: Status 404 returned error can't find the container with id 785cf1f0611121bc184a4bc0f34377c346c490853673595b9763600cad5d4149 Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.314269 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:07:50 crc kubenswrapper[4775]: E0321 05:07:50.314733 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="dnsmasq-dns" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.314759 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="dnsmasq-dns" Mar 21 05:07:50 crc kubenswrapper[4775]: E0321 05:07:50.314776 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="init" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.314784 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="init" Mar 21 05:07:50 crc kubenswrapper[4775]: E0321 05:07:50.314813 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" containerName="cinder-db-sync" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.314823 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" containerName="cinder-db-sync" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.315037 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" containerName="cinder-db-sync" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.315075 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaad9349-8b1e-4f07-b3c9-36bc4781a386" containerName="dnsmasq-dns" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.316189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.324606 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nm9jl" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.324858 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.325104 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.325262 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.329558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375038 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375311 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375438 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375457 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.375485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwll\" (UniqueName: \"kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.417859 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.421432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerStarted","Data":"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.421641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerStarted","Data":"5d037f60a7382103283fb7dc977007a3668afac9d065f14a09fa2c15aa21b79e"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.463418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerStarted","Data":"ea4d39e040f38b33347813ecac429d1646c7747cc95596bc69a68249c6432f4a"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.477492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478324 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478486 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwll\" (UniqueName: \"kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.478863 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.482174 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.483905 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.489884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.496096 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.500752 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.509205 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.521205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.524937 4775 generic.go:334] "Generic (PLEG): container finished" podID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerID="3ffce781b6baa0a2f3ae718a0cd398633d05ff48719dba863adc5152b47fe849" exitCode=2 Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.525009 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerDied","Data":"3ffce781b6baa0a2f3ae718a0cd398633d05ff48719dba863adc5152b47fe849"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.526614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwll\" (UniqueName: \"kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll\") pod \"cinder-scheduler-0\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") " pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.557476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerStarted","Data":"f14a2e346ace37be895dcc94dd25c65e41716e7e754b07d2e1e3277674aeac90"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.557516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerStarted","Data":"785cf1f0611121bc184a4bc0f34377c346c490853673595b9763600cad5d4149"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.558868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58948d8bb4-rcw89" event={"ID":"279dff90-9d39-418a-b5e7-00333a376d16","Type":"ContainerStarted","Data":"e3c653ccf6756c923a2ae6b032f17712a859f9775c3f8a1f6d64239a81f63d2e"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.558889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58948d8bb4-rcw89" event={"ID":"279dff90-9d39-418a-b5e7-00333a376d16","Type":"ContainerStarted","Data":"548faf90447b1a3a4e1b70a575543b3ccd563821c6c627bcd784e95cf7f5a15f"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.558897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-58948d8bb4-rcw89" event={"ID":"279dff90-9d39-418a-b5e7-00333a376d16","Type":"ContainerStarted","Data":"7e253dae2491da166ef4e31ce862a09f50e642f143ea394133c2a881166deda6"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.559199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.559241 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.653700 4775 generic.go:334] "Generic (PLEG): container finished" podID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" containerID="180859970b041857dafe2a96bb332f8d1442cf9ef9839e6bbd7b85783c3dc75d" exitCode=0 Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.653885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" event={"ID":"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f","Type":"ContainerDied","Data":"180859970b041857dafe2a96bb332f8d1442cf9ef9839e6bbd7b85783c3dc75d"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.653912 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" event={"ID":"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f","Type":"ContainerStarted","Data":"fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4"} Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.654851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.655028 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.657898 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.660773 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.665232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4hqb\" (UniqueName: \"kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.665339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.665461 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") 
" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.737069 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58948d8bb4-rcw89" podStartSLOduration=2.7370444860000003 podStartE2EDuration="2.737044486s" podCreationTimestamp="2026-03-21 05:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:50.644285214 +0000 UTC m=+1223.620748838" watchObservedRunningTime="2026-03-21 05:07:50.737044486 +0000 UTC m=+1223.713508110" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.771920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.772325 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.772351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4hqb\" (UniqueName: \"kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.772405 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.772461 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.772501 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.795711 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.805933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.806870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config\") pod 
\"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.807519 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.808582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.838084 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4hqb\" (UniqueName: \"kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb\") pod \"dnsmasq-dns-774db89647-9pnrf\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.858587 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.860166 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.872290 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.889098 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.906146 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.979197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.979576 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.979715 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.979917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.980068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.980156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctknn\" (UniqueName: \"kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:50 crc kubenswrapper[4775]: I0321 05:07:50.980250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082451 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082554 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctknn\" (UniqueName: \"kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.082723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.083974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " 
pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.084591 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.088137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.088833 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.091602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.092027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.105039 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctknn\" (UniqueName: 
\"kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn\") pod \"cinder-api-0\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.219885 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:51 crc kubenswrapper[4775]: E0321 05:07:51.308160 4775 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 21 05:07:51 crc kubenswrapper[4775]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 21 05:07:51 crc kubenswrapper[4775]: > podSandboxID="fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4" Mar 21 05:07:51 crc kubenswrapper[4775]: E0321 05:07:51.308368 4775 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 05:07:51 crc kubenswrapper[4775]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n548h78h598h67bh67chf7h67dhch88h5bh596h546h5b9h5b7h57hd9h58h5dbh674hfch548h549h99hbch5c9h546h687h687h648hd4h8bh98q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2r94w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6b7b667979-lk5mk_openstack(f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 21 05:07:51 crc kubenswrapper[4775]: > logger="UnhandledError" Mar 21 05:07:51 crc kubenswrapper[4775]: E0321 05:07:51.309518 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" podUID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.339932 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:07:51 crc kubenswrapper[4775]: W0321 05:07:51.394548 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f765864_79ee_415c_99be_55894a20c087.slice/crio-85e53e5965bf199e0c4123a3954f4e5d437587f60519d3c96ac82b5f50bb3679 WatchSource:0}: Error finding container 85e53e5965bf199e0c4123a3954f4e5d437587f60519d3c96ac82b5f50bb3679: Status 404 returned error can't find the container with id 85e53e5965bf199e0c4123a3954f4e5d437587f60519d3c96ac82b5f50bb3679 Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.492893 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"] Mar 21 05:07:51 crc kubenswrapper[4775]: W0321 05:07:51.509198 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b34a259_4754_41ea_b3b7_cd1ba5fb53c6.slice/crio-a73be0fd90826150202a28a361d434d06f770ab26ca7c2e2a7f4cd5e030488a3 WatchSource:0}: Error finding container a73be0fd90826150202a28a361d434d06f770ab26ca7c2e2a7f4cd5e030488a3: Status 404 returned error can't find the container with id a73be0fd90826150202a28a361d434d06f770ab26ca7c2e2a7f4cd5e030488a3 Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.682368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerStarted","Data":"0173c3dcb2f1ee362b645abaeead4a322f137b6409341a3b746ad5f19cffafb5"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.682966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerStarted","Data":"29a18bd4308c28984ec8d5e5e490324db58708c1d07372b24ee7783e1995157d"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.685833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" 
event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerStarted","Data":"06c961a967b1c07bf826487fd3c050eb92ff3106f515b9136df5283020977b92"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.685973 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.691705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerStarted","Data":"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.694437 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-9pnrf" event={"ID":"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6","Type":"ContainerStarted","Data":"a73be0fd90826150202a28a361d434d06f770ab26ca7c2e2a7f4cd5e030488a3"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.705561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerStarted","Data":"85e53e5965bf199e0c4123a3954f4e5d437587f60519d3c96ac82b5f50bb3679"} Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.709706 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.711625 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.711609828 podStartE2EDuration="8.711609828s" podCreationTimestamp="2026-03-21 05:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:51.695339198 +0000 UTC m=+1224.671802822" watchObservedRunningTime="2026-03-21 05:07:51.711609828 +0000 UTC m=+1224.688073452" Mar 21 05:07:51 crc 
kubenswrapper[4775]: I0321 05:07:51.756066 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.7560453339999995 podStartE2EDuration="6.756045334s" podCreationTimestamp="2026-03-21 05:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:51.71945889 +0000 UTC m=+1224.695922524" watchObservedRunningTime="2026-03-21 05:07:51.756045334 +0000 UTC m=+1224.732508968" Mar 21 05:07:51 crc kubenswrapper[4775]: I0321 05:07:51.761441 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68ccf5bf68-lf5dz" podStartSLOduration=3.761423136 podStartE2EDuration="3.761423136s" podCreationTimestamp="2026-03-21 05:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:51.746180845 +0000 UTC m=+1224.722644479" watchObservedRunningTime="2026-03-21 05:07:51.761423136 +0000 UTC m=+1224.737886760" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.099485 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r94w\" (UniqueName: \"kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206300 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206461 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.206503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb\") pod \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\" (UID: \"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f\") " Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.214324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w" (OuterVolumeSpecName: "kube-api-access-2r94w") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "kube-api-access-2r94w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.261031 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.264791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config" (OuterVolumeSpecName: "config") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.266568 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.266701 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.284089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" (UID: "f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.307988 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.308014 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.308024 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.308034 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r94w\" (UniqueName: \"kubernetes.io/projected/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-kube-api-access-2r94w\") on node \"crc\" 
DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.308044 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.308052 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.725281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" event={"ID":"f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f","Type":"ContainerDied","Data":"fbe39bc0c49faa915d2c4f50c757fc038107aae3b2ab6a7a97e1e90b02b1d1a4"} Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.725545 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-lk5mk" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.725565 4775 scope.go:117] "RemoveContainer" containerID="180859970b041857dafe2a96bb332f8d1442cf9ef9839e6bbd7b85783c3dc75d" Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.731090 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerStarted","Data":"7645cb41a5451f5509943511872ab22f39410088b4a0322f6df0171a3ffb7fc5"} Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.731185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerStarted","Data":"a2b7bcfad5b3fa6e886cea0c4b4a9a0a0cb851a1435a0c4faa718075cee23435"} Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.735919 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerID="a5baf6934d04dd62ef1e7d5ae437ae7f3e8c190bcbdb11263300fe3c8b4015ff" exitCode=0 Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.736007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-9pnrf" event={"ID":"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6","Type":"ContainerDied","Data":"a5baf6934d04dd62ef1e7d5ae437ae7f3e8c190bcbdb11263300fe3c8b4015ff"} Mar 21 05:07:52 crc kubenswrapper[4775]: I0321 05:07:52.999188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.029187 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-lk5mk"] Mar 21 05:07:53 crc kubenswrapper[4775]: E0321 05:07:53.439984 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ac5c16c_56ac_4299_ae61_a8200986ce10.slice/crio-conmon-4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ac5c16c_56ac_4299_ae61_a8200986ce10.slice/crio-4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90.scope\": RecentStats: unable to find data in memory cache]" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.648961 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.649057 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.676297 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" 
path="/var/lib/kubelet/pods/f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f/volumes" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.681880 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.709048 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.748139 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerStarted","Data":"d8c073f8b7d47daaf28becd8c93adbf3ccc063178d35dc0cd6aeea148ed3060c"} Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.748516 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.750286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-9pnrf" event={"ID":"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6","Type":"ContainerStarted","Data":"b889dfec32da8490ede4a5642cd397e9def74cf497031b982af0df0b06f79197"} Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.750465 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.752789 4775 generic.go:334] "Generic (PLEG): container finished" podID="1658991f-7a2b-4ce9-a240-a940385e0b8f" containerID="796b087daa0094c51c318e24d71b74b2beb873149bcfc37bff7fc5e4bd3c8ed7" exitCode=0 Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.752861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cmq2" event={"ID":"1658991f-7a2b-4ce9-a240-a940385e0b8f","Type":"ContainerDied","Data":"796b087daa0094c51c318e24d71b74b2beb873149bcfc37bff7fc5e4bd3c8ed7"} Mar 21 05:07:53 crc kubenswrapper[4775]: 
I0321 05:07:53.754569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerStarted","Data":"4764d465ee215948cf3a07d17ef04460952c808ab216c2d6600d30877448b571"} Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.756711 4775 generic.go:334] "Generic (PLEG): container finished" podID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerID="4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90" exitCode=0 Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.756802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerDied","Data":"4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90"} Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.758600 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.758662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.778771 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.778746196 podStartE2EDuration="3.778746196s" podCreationTimestamp="2026-03-21 05:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:53.769388261 +0000 UTC m=+1226.745851915" watchObservedRunningTime="2026-03-21 05:07:53.778746196 +0000 UTC m=+1226.755209820" Mar 21 05:07:53 crc kubenswrapper[4775]: I0321 05:07:53.831796 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-774db89647-9pnrf" podStartSLOduration=3.831772184 podStartE2EDuration="3.831772184s" 
podCreationTimestamp="2026-03-21 05:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:53.807865009 +0000 UTC m=+1226.784328633" watchObservedRunningTime="2026-03-21 05:07:53.831772184 +0000 UTC m=+1226.808235818" Mar 21 05:07:54 crc kubenswrapper[4775]: I0321 05:07:54.772953 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerStarted","Data":"0b6150bc28d08d5b96097378ba0b885d4bf49a94a51c13823ebc2ecd1793e28b"} Mar 21 05:07:54 crc kubenswrapper[4775]: I0321 05:07:54.791518 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.967906501 podStartE2EDuration="4.791500357s" podCreationTimestamp="2026-03-21 05:07:50 +0000 UTC" firstStartedPulling="2026-03-21 05:07:51.41270204 +0000 UTC m=+1224.389165664" lastFinishedPulling="2026-03-21 05:07:52.236295896 +0000 UTC m=+1225.212759520" observedRunningTime="2026-03-21 05:07:54.791389804 +0000 UTC m=+1227.767853428" watchObservedRunningTime="2026-03-21 05:07:54.791500357 +0000 UTC m=+1227.767963981" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.211714 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9cmq2" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.262268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bczb\" (UniqueName: \"kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb\") pod \"1658991f-7a2b-4ce9-a240-a940385e0b8f\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.262554 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data\") pod \"1658991f-7a2b-4ce9-a240-a940385e0b8f\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.262753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle\") pod \"1658991f-7a2b-4ce9-a240-a940385e0b8f\" (UID: \"1658991f-7a2b-4ce9-a240-a940385e0b8f\") " Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.270240 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1658991f-7a2b-4ce9-a240-a940385e0b8f" (UID: "1658991f-7a2b-4ce9-a240-a940385e0b8f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.271325 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb" (OuterVolumeSpecName: "kube-api-access-9bczb") pod "1658991f-7a2b-4ce9-a240-a940385e0b8f" (UID: "1658991f-7a2b-4ce9-a240-a940385e0b8f"). 
InnerVolumeSpecName "kube-api-access-9bczb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.290373 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1658991f-7a2b-4ce9-a240-a940385e0b8f" (UID: "1658991f-7a2b-4ce9-a240-a940385e0b8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.364716 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bczb\" (UniqueName: \"kubernetes.io/projected/1658991f-7a2b-4ce9-a240-a940385e0b8f-kube-api-access-9bczb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.364758 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.364769 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1658991f-7a2b-4ce9-a240-a940385e0b8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.476373 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.590630 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d65998c7c-prp5b"] Mar 21 05:07:55 crc kubenswrapper[4775]: E0321 05:07:55.591058 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" containerName="barbican-db-sync" Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.591073 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" containerName="barbican-db-sync"
Mar 21 05:07:55 crc kubenswrapper[4775]: E0321 05:07:55.591086 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" containerName="init"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.591095 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" containerName="init"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.591321 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" containerName="barbican-db-sync"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.591350 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a4d8e0-d0d0-4b04-9f21-ddf0d94c0e8f" containerName="init"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.592393 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.595160 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.595948 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.596695 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d65998c7c-prp5b"]
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.671495 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-httpd-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674771 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-internal-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674813 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxrd\" (UniqueName: \"kubernetes.io/projected/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-kube-api-access-llxrd\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674874 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-public-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674904 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674950 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-combined-ca-bundle\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.674966 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-ovndb-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.734007 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.734061 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.770932 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.775897 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-combined-ca-bundle\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.775943 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-ovndb-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.775992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-httpd-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.776044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-internal-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.776076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxrd\" (UniqueName: \"kubernetes.io/projected/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-kube-api-access-llxrd\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.776170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-public-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.776196 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.780019 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-internal-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.781874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-ovndb-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.783535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-combined-ca-bundle\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.784292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.785070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-public-tls-certs\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.787761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-httpd-config\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.799698 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.805134 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxrd\" (UniqueName: \"kubernetes.io/projected/3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef-kube-api-access-llxrd\") pod \"neutron-7d65998c7c-prp5b\" (UID: \"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef\") " pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.806550 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api-log" containerID="cri-o://7645cb41a5451f5509943511872ab22f39410088b4a0322f6df0171a3ffb7fc5" gracePeriod=30
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.806823 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9cmq2"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.807461 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9cmq2" event={"ID":"1658991f-7a2b-4ce9-a240-a940385e0b8f","Type":"ContainerDied","Data":"cb42616105e59d6830a168b6da9f0c076bcf19cd878442d47ceb6cd33726aca0"}
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.807482 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb42616105e59d6830a168b6da9f0c076bcf19cd878442d47ceb6cd33726aca0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.808719 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api" containerID="cri-o://d8c073f8b7d47daaf28becd8c93adbf3ccc063178d35dc0cd6aeea148ed3060c" gracePeriod=30
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.808912 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.808927 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.913217 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d65998c7c-prp5b"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.982481 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75f5b547c8-mgjw5"]
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.984333 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.988480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.989063 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 21 05:07:55 crc kubenswrapper[4775]: I0321 05:07:55.989455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ph6zz"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.003387 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6dd9f89d55-fdf8c"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.005279 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.009020 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.045310 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dd9f89d55-fdf8c"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.083342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75f5b547c8-mgjw5"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.161179 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.161813 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-774db89647-9pnrf" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="dnsmasq-dns" containerID="cri-o://b889dfec32da8490ede4a5642cd397e9def74cf497031b982af0df0b06f79197" gracePeriod=10
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxt6r\" (UniqueName: \"kubernetes.io/projected/177228f6-7f69-49c2-9942-ea0a98b56d13-kube-api-access-hxt6r\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data-custom\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data-custom\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjfh\" (UniqueName: \"kubernetes.io/projected/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-kube-api-access-pwjfh\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/177228f6-7f69-49c2-9942-ea0a98b56d13-logs\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188279 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-logs\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188363 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188387 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-combined-ca-bundle\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.188425 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-combined-ca-bundle\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.243529 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.245007 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.289913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxt6r\" (UniqueName: \"kubernetes.io/projected/177228f6-7f69-49c2-9942-ea0a98b56d13-kube-api-access-hxt6r\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.289962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data-custom\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290006 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data-custom\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290035 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjfh\" (UniqueName: \"kubernetes.io/projected/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-kube-api-access-pwjfh\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/177228f6-7f69-49c2-9942-ea0a98b56d13-logs\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290091 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-logs\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290179 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290201 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-combined-ca-bundle\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.290230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-combined-ca-bundle\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.291794 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-logs\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.292014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/177228f6-7f69-49c2-9942-ea0a98b56d13-logs\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.295797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.297007 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.305359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-combined-ca-bundle\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.305444 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.320869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.334735 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.337411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.338638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-combined-ca-bundle\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.341260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-config-data-custom\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.344330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjfh\" (UniqueName: \"kubernetes.io/projected/d8a7c2e5-3643-4675-9888-3c310e4f9ad4-kube-api-access-pwjfh\") pod \"barbican-worker-6dd9f89d55-fdf8c\" (UID: \"d8a7c2e5-3643-4675-9888-3c310e4f9ad4\") " pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.345509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"]
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.350995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/177228f6-7f69-49c2-9942-ea0a98b56d13-config-data-custom\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.366804 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6dd9f89d55-fdf8c"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.368995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxt6r\" (UniqueName: \"kubernetes.io/projected/177228f6-7f69-49c2-9942-ea0a98b56d13-kube-api-access-hxt6r\") pod \"barbican-keystone-listener-75f5b547c8-mgjw5\" (UID: \"177228f6-7f69-49c2-9942-ea0a98b56d13\") " pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.392867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.393141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc62\" (UniqueName: \"kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.393181 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.393220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.393280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.393298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496772 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496803 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496853 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496914 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496936 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc62\" (UniqueName: \"kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.496983 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.497029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.497062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzmg\" (UniqueName: \"kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.501041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.501062 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.501816 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.501889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.501832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.591003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc62\" (UniqueName: \"kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62\") pod \"dnsmasq-dns-6578955fd5-w2ltw\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.598425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.599259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzmg\" (UniqueName: \"kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.599412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.599519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.599634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.600298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.609951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.610989 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.612171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.649796 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzmg\" (UniqueName: \"kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg\") pod \"barbican-api-7dc8b6bf8b-btcqr\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " pod="openstack/barbican-api-7dc8b6bf8b-btcqr"
Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.660733 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5" Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.834971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.847795 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.875515 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerID="b889dfec32da8490ede4a5642cd397e9def74cf497031b982af0df0b06f79197" exitCode=0 Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.875651 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-9pnrf" event={"ID":"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6","Type":"ContainerDied","Data":"b889dfec32da8490ede4a5642cd397e9def74cf497031b982af0df0b06f79197"} Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.897925 4775 generic.go:334] "Generic (PLEG): container finished" podID="cb53eb05-afa8-4039-805c-477b25ed3075" containerID="d8c073f8b7d47daaf28becd8c93adbf3ccc063178d35dc0cd6aeea148ed3060c" exitCode=0 Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.897955 4775 generic.go:334] "Generic (PLEG): container finished" podID="cb53eb05-afa8-4039-805c-477b25ed3075" containerID="7645cb41a5451f5509943511872ab22f39410088b4a0322f6df0171a3ffb7fc5" exitCode=143 Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.898849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerDied","Data":"d8c073f8b7d47daaf28becd8c93adbf3ccc063178d35dc0cd6aeea148ed3060c"} Mar 21 05:07:56 crc kubenswrapper[4775]: I0321 05:07:56.898934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerDied","Data":"7645cb41a5451f5509943511872ab22f39410088b4a0322f6df0171a3ffb7fc5"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.046473 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d65998c7c-prp5b"] Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.054240 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.099975 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114658 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114732 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctknn\" (UniqueName: \"kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114828 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114903 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.114961 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data\") pod \"cb53eb05-afa8-4039-805c-477b25ed3075\" (UID: \"cb53eb05-afa8-4039-805c-477b25ed3075\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.131657 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs" (OuterVolumeSpecName: "logs") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.132101 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.135962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.151470 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn" (OuterVolumeSpecName: "kube-api-access-ctknn") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "kube-api-access-ctknn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.160354 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts" (OuterVolumeSpecName: "scripts") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.214296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.216675 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.216746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4hqb\" (UniqueName: \"kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.216862 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.216916 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.216936 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb\") pod \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\" (UID: \"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6\") " Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217411 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217421 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb53eb05-afa8-4039-805c-477b25ed3075-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217430 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217438 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb53eb05-afa8-4039-805c-477b25ed3075-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217445 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.217454 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctknn\" (UniqueName: \"kubernetes.io/projected/cb53eb05-afa8-4039-805c-477b25ed3075-kube-api-access-ctknn\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.237535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dd9f89d55-fdf8c"] Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 
05:07:57.251427 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb" (OuterVolumeSpecName: "kube-api-access-c4hqb") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "kube-api-access-c4hqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.315737 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.322008 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.322038 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4hqb\" (UniqueName: \"kubernetes.io/projected/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-kube-api-access-c4hqb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.385556 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config" (OuterVolumeSpecName: "config") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.388999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.419969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data" (OuterVolumeSpecName: "config-data") pod "cb53eb05-afa8-4039-805c-477b25ed3075" (UID: "cb53eb05-afa8-4039-805c-477b25ed3075"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.424109 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb53eb05-afa8-4039-805c-477b25ed3075-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.424305 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.424317 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.448337 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75f5b547c8-mgjw5"] Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.453688 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.471867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" (UID: "6b34a259-4754-41ea-b3b7-cd1ba5fb53c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.526309 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.526340 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.603959 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"] Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.639360 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"] Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.908489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5" event={"ID":"177228f6-7f69-49c2-9942-ea0a98b56d13","Type":"ContainerStarted","Data":"d36081bb2a1cc1559310140250d86d1092b1275bdc8dd051dd3d2be34341636a"} Mar 21 05:07:57 crc 
kubenswrapper[4775]: I0321 05:07:57.913690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" event={"ID":"d7dc929b-501f-43cb-8811-394841cc54f9","Type":"ContainerStarted","Data":"4431b638594d2b34fefc401fb2d4af3e48aa80179d99ea9d13c2365707189fa3"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.918673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerStarted","Data":"56894b1d2f57fb4749d6394cfec85a58eb1ab9e8c5e0205ea22bb5116fe8c5c9"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.924517 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb53eb05-afa8-4039-805c-477b25ed3075","Type":"ContainerDied","Data":"a2b7bcfad5b3fa6e886cea0c4b4a9a0a0cb851a1435a0c4faa718075cee23435"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.924581 4775 scope.go:117] "RemoveContainer" containerID="d8c073f8b7d47daaf28becd8c93adbf3ccc063178d35dc0cd6aeea148ed3060c" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.924742 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.929937 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-9pnrf" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.929949 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-9pnrf" event={"ID":"6b34a259-4754-41ea-b3b7-cd1ba5fb53c6","Type":"ContainerDied","Data":"a73be0fd90826150202a28a361d434d06f770ab26ca7c2e2a7f4cd5e030488a3"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.933563 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd9f89d55-fdf8c" event={"ID":"d8a7c2e5-3643-4675-9888-3c310e4f9ad4","Type":"ContainerStarted","Data":"c196ea57d4dccfd615b126ec0536341650101ad3ff1e1ea6ab1956d5c98c8a4e"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.941246 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.942698 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d65998c7c-prp5b" event={"ID":"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef","Type":"ContainerStarted","Data":"c6a873d5c7237282e58f0d52558bff33030d6b81e7787de55c1c7bc6d9dcd7b1"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.942754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d65998c7c-prp5b" event={"ID":"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef","Type":"ContainerStarted","Data":"a2f9731ed7da4398238dad98c11c9d541a2a8817fc90ad06c6e43ea8fb25f8b3"} Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.975697 4775 scope.go:117] "RemoveContainer" containerID="7645cb41a5451f5509943511872ab22f39410088b4a0322f6df0171a3ffb7fc5" Mar 21 05:07:57 crc kubenswrapper[4775]: I0321 05:07:57.991176 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.002225 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 
05:07:58.036167 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"] Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.049965 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774db89647-9pnrf"] Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.070331 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:58 crc kubenswrapper[4775]: E0321 05:07:58.070776 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api-log" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.070796 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api-log" Mar 21 05:07:58 crc kubenswrapper[4775]: E0321 05:07:58.070820 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="dnsmasq-dns" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.070830 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="dnsmasq-dns" Mar 21 05:07:58 crc kubenswrapper[4775]: E0321 05:07:58.070854 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="init" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.070862 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="init" Mar 21 05:07:58 crc kubenswrapper[4775]: E0321 05:07:58.070875 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.070881 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api" Mar 21 05:07:58 crc 
kubenswrapper[4775]: I0321 05:07:58.071061 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api-log" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.071099 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" containerName="cinder-api" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.071110 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" containerName="dnsmasq-dns" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.072298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.101582 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.102220 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.111261 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.136308 4775 scope.go:117] "RemoveContainer" containerID="b889dfec32da8490ede4a5642cd397e9def74cf497031b982af0df0b06f79197" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c16d835a-1ec2-473d-b2d8-c8e7c978e140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c16d835a-1ec2-473d-b2d8-c8e7c978e140-logs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144257 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144311 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-scripts\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144333 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144377 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data-custom\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.144397 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvr69\" (UniqueName: \"kubernetes.io/projected/c16d835a-1ec2-473d-b2d8-c8e7c978e140-kube-api-access-nvr69\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.152432 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.202447 4775 scope.go:117] "RemoveContainer" containerID="a5baf6934d04dd62ef1e7d5ae437ae7f3e8c190bcbdb11263300fe3c8b4015ff" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.245999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16d835a-1ec2-473d-b2d8-c8e7c978e140-logs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.246060 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.246082 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250291 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-scripts\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250380 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data-custom\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250539 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvr69\" (UniqueName: \"kubernetes.io/projected/c16d835a-1ec2-473d-b2d8-c8e7c978e140-kube-api-access-nvr69\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " 
pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250699 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c16d835a-1ec2-473d-b2d8-c8e7c978e140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.250904 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c16d835a-1ec2-473d-b2d8-c8e7c978e140-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.253729 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16d835a-1ec2-473d-b2d8-c8e7c978e140-logs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.261002 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data-custom\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.263443 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.271915 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.272080 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.273562 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvr69\" (UniqueName: \"kubernetes.io/projected/c16d835a-1ec2-473d-b2d8-c8e7c978e140-kube-api-access-nvr69\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.273790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-config-data\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.285707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16d835a-1ec2-473d-b2d8-c8e7c978e140-scripts\") pod \"cinder-api-0\" (UID: \"c16d835a-1ec2-473d-b2d8-c8e7c978e140\") " pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.501814 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.555767 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:07:58 crc kubenswrapper[4775]: I0321 05:07:58.660017 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:58.997146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d65998c7c-prp5b" event={"ID":"3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef","Type":"ContainerStarted","Data":"66d5512823bb0c671e4b2e3db5aaa6a1ed79a9451ed17faf46bfaddedc2a672c"} Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:58.997447 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d65998c7c-prp5b" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.021912 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d65998c7c-prp5b" podStartSLOduration=4.02189394 podStartE2EDuration="4.02189394s" podCreationTimestamp="2026-03-21 05:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:59.018924056 +0000 UTC m=+1231.995387680" watchObservedRunningTime="2026-03-21 05:07:59.02189394 +0000 UTC m=+1231.998357554" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.022141 4775 generic.go:334] "Generic (PLEG): container finished" podID="d7dc929b-501f-43cb-8811-394841cc54f9" containerID="6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25" exitCode=0 Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.022207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" 
event={"ID":"d7dc929b-501f-43cb-8811-394841cc54f9","Type":"ContainerDied","Data":"6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25"} Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.029171 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerStarted","Data":"9e09a5090c4d61aabf6c91b2097f27d3892ae005a5644d72b8acf7ad49174027"} Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.029407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerStarted","Data":"da40033a5dd52e534fbf0e897d41a379c55c497ad81da6c097f013d7acf158d3"} Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.030010 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.030150 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.098609 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podStartSLOduration=3.098588838 podStartE2EDuration="3.098588838s" podCreationTimestamp="2026-03-21 05:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:07:59.078074758 +0000 UTC m=+1232.054538382" watchObservedRunningTime="2026-03-21 05:07:59.098588838 +0000 UTC m=+1232.075052462" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.150688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.234892 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.235086 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.676665 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b34a259-4754-41ea-b3b7-cd1ba5fb53c6" path="/var/lib/kubelet/pods/6b34a259-4754-41ea-b3b7-cd1ba5fb53c6/volumes" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.677817 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb53eb05-afa8-4039-805c-477b25ed3075" path="/var/lib/kubelet/pods/cb53eb05-afa8-4039-805c-477b25ed3075/volumes" Mar 21 05:07:59 crc kubenswrapper[4775]: I0321 05:07:59.974582 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.072584 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77fd86567d-mf2wb"] Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.075995 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.078597 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.078787 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.090339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c16d835a-1ec2-473d-b2d8-c8e7c978e140","Type":"ContainerStarted","Data":"e7f287bd42031f92358435f1c28406a556f8b03cbd50eec4d7c178aa7bbaac92"} Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.095908 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77fd86567d-mf2wb"] Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.113034 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" event={"ID":"d7dc929b-501f-43cb-8811-394841cc54f9","Type":"ContainerStarted","Data":"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e"} Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.113848 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.141727 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" podStartSLOduration=4.141704707 podStartE2EDuration="4.141704707s" podCreationTimestamp="2026-03-21 05:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:00.141567623 +0000 UTC m=+1233.118031247" watchObservedRunningTime="2026-03-21 05:08:00.141704707 +0000 UTC m=+1233.118168331" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.178200 
4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567828-6klnl"] Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.179512 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-6klnl" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.182492 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.182666 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.183181 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192435 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-internal-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1727040d-36f5-431c-b8f1-84e206146dcf-logs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " 
pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data-custom\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192579 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fqs\" (UniqueName: \"kubernetes.io/projected/1727040d-36f5-431c-b8f1-84e206146dcf-kube-api-access-49fqs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-combined-ca-bundle\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.192675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-public-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.194500 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-6klnl"] Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294246 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-internal-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294341 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1727040d-36f5-431c-b8f1-84e206146dcf-logs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294432 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data-custom\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294484 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fqs\" (UniqueName: \"kubernetes.io/projected/1727040d-36f5-431c-b8f1-84e206146dcf-kube-api-access-49fqs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-combined-ca-bundle\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294559 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4qg\" (UniqueName: \"kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg\") pod \"auto-csr-approver-29567828-6klnl\" (UID: \"eccf102b-f91a-4251-b878-1c21eea92522\") " pod="openshift-infra/auto-csr-approver-29567828-6klnl" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-public-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.294891 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.296393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1727040d-36f5-431c-b8f1-84e206146dcf-logs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.300629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-internal-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.304777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-combined-ca-bundle\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.304869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.325153 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-public-tls-certs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.352714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fqs\" (UniqueName: \"kubernetes.io/projected/1727040d-36f5-431c-b8f1-84e206146dcf-kube-api-access-49fqs\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.386158 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1727040d-36f5-431c-b8f1-84e206146dcf-config-data-custom\") pod \"barbican-api-77fd86567d-mf2wb\" (UID: \"1727040d-36f5-431c-b8f1-84e206146dcf\") " pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.398216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4qg\" (UniqueName: 
\"kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg\") pod \"auto-csr-approver-29567828-6klnl\" (UID: \"eccf102b-f91a-4251-b878-1c21eea92522\") " pod="openshift-infra/auto-csr-approver-29567828-6klnl" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.401548 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.439436 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4qg\" (UniqueName: \"kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg\") pod \"auto-csr-approver-29567828-6klnl\" (UID: \"eccf102b-f91a-4251-b878-1c21eea92522\") " pod="openshift-infra/auto-csr-approver-29567828-6klnl" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.512620 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-6klnl" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.874459 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6496ddbdd4-v5mc5" Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.959361 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.959781 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon-log" containerID="cri-o://4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd" gracePeriod=30 Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.959980 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" 
containerID="cri-o://1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa" gracePeriod=30 Mar 21 05:08:00 crc kubenswrapper[4775]: I0321 05:08:00.976596 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.012708 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.051973 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.121763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c16d835a-1ec2-473d-b2d8-c8e7c978e140","Type":"ContainerStarted","Data":"7e429c498fc0f4914500c6d809a65a750f1778d61663094beaa9607086649ce5"} Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.121841 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="cinder-scheduler" containerID="cri-o://4764d465ee215948cf3a07d17ef04460952c808ab216c2d6600d30877448b571" gracePeriod=30 Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.121900 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="probe" containerID="cri-o://0b6150bc28d08d5b96097378ba0b885d4bf49a94a51c13823ebc2ecd1793e28b" gracePeriod=30 Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.651359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77fd86567d-mf2wb"] Mar 21 05:08:01 crc kubenswrapper[4775]: W0321 05:08:01.675221 4775 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1727040d_36f5_431c_b8f1_84e206146dcf.slice/crio-5f510ceba77ac4c1d3cb0ef8dfe2b2f74dd976f3a7826bebbd6a3e86f98c64b7 WatchSource:0}: Error finding container 5f510ceba77ac4c1d3cb0ef8dfe2b2f74dd976f3a7826bebbd6a3e86f98c64b7: Status 404 returned error can't find the container with id 5f510ceba77ac4c1d3cb0ef8dfe2b2f74dd976f3a7826bebbd6a3e86f98c64b7 Mar 21 05:08:01 crc kubenswrapper[4775]: I0321 05:08:01.722089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-6klnl"] Mar 21 05:08:01 crc kubenswrapper[4775]: W0321 05:08:01.758095 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeccf102b_f91a_4251_b878_1c21eea92522.slice/crio-486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f WatchSource:0}: Error finding container 486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f: Status 404 returned error can't find the container with id 486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.133655 4775 generic.go:334] "Generic (PLEG): container finished" podID="3f765864-79ee-415c-99be-55894a20c087" containerID="0b6150bc28d08d5b96097378ba0b885d4bf49a94a51c13823ebc2ecd1793e28b" exitCode=0 Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.133727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerDied","Data":"0b6150bc28d08d5b96097378ba0b885d4bf49a94a51c13823ebc2ecd1793e28b"} Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.135443 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-6klnl" 
event={"ID":"eccf102b-f91a-4251-b878-1c21eea92522","Type":"ContainerStarted","Data":"486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.137462 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5" event={"ID":"177228f6-7f69-49c2-9942-ea0a98b56d13","Type":"ContainerStarted","Data":"a4f1a7b8252d5d938c9f37a5c854f86c11cfca913cfd03013441dcd3d7e9a4b0"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.137509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5" event={"ID":"177228f6-7f69-49c2-9942-ea0a98b56d13","Type":"ContainerStarted","Data":"290c55a6ed9a3ccda29d74736c7303706a061c58f9cfbcd56bca73fbde4d0be1"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.140129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c16d835a-1ec2-473d-b2d8-c8e7c978e140","Type":"ContainerStarted","Data":"95c44b78dbc399f3bb0547a6982db27b3eb10d48a82b05ae66a24f89138c5a4a"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.140260 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.142403 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77fd86567d-mf2wb" event={"ID":"1727040d-36f5-431c-b8f1-84e206146dcf","Type":"ContainerStarted","Data":"89930651d1c2dbb77788af4c417745e4d0c36ec1ff1f256a634a6fc979de8740"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.142468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77fd86567d-mf2wb" event={"ID":"1727040d-36f5-431c-b8f1-84e206146dcf","Type":"ContainerStarted","Data":"77c5ea4c20c4d779f476066bc96c8a9d0dc23e794f7f3d5e06ca5c68bd151774"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.142487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77fd86567d-mf2wb" event={"ID":"1727040d-36f5-431c-b8f1-84e206146dcf","Type":"ContainerStarted","Data":"5f510ceba77ac4c1d3cb0ef8dfe2b2f74dd976f3a7826bebbd6a3e86f98c64b7"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.142593 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77fd86567d-mf2wb"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.144662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd9f89d55-fdf8c" event={"ID":"d8a7c2e5-3643-4675-9888-3c310e4f9ad4","Type":"ContainerStarted","Data":"5ada761d9c2f07a9ac6841539f8838edebf3297660a3e8de73a390e936d49876"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.144871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd9f89d55-fdf8c" event={"ID":"d8a7c2e5-3643-4675-9888-3c310e4f9ad4","Type":"ContainerStarted","Data":"0c9ad89e2e596eca72945de953e3d9cd64eec0394196ab36a4b90dd4fabadeab"}
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.163842 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75f5b547c8-mgjw5" podStartSLOduration=3.422676166 podStartE2EDuration="7.163824363s" podCreationTimestamp="2026-03-21 05:07:55 +0000 UTC" firstStartedPulling="2026-03-21 05:07:57.452286362 +0000 UTC m=+1230.428749986" lastFinishedPulling="2026-03-21 05:08:01.193434559 +0000 UTC m=+1234.169898183" observedRunningTime="2026-03-21 05:08:02.156598549 +0000 UTC m=+1235.133062173" watchObservedRunningTime="2026-03-21 05:08:02.163824363 +0000 UTC m=+1235.140287987"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.183817 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.183789047 podStartE2EDuration="4.183789047s" podCreationTimestamp="2026-03-21 05:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:02.174951387 +0000 UTC m=+1235.151415011" watchObservedRunningTime="2026-03-21 05:08:02.183789047 +0000 UTC m=+1235.160252691"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.196365 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6dd9f89d55-fdf8c" podStartSLOduration=3.247045353 podStartE2EDuration="7.196341632s" podCreationTimestamp="2026-03-21 05:07:55 +0000 UTC" firstStartedPulling="2026-03-21 05:07:57.243894183 +0000 UTC m=+1230.220357807" lastFinishedPulling="2026-03-21 05:08:01.193190452 +0000 UTC m=+1234.169654086" observedRunningTime="2026-03-21 05:08:02.193973235 +0000 UTC m=+1235.170436859" watchObservedRunningTime="2026-03-21 05:08:02.196341632 +0000 UTC m=+1235.172805266"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.481980 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.482307 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.482348 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.483026 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:08:02 crc kubenswrapper[4775]: I0321 05:08:02.483097 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791" gracePeriod=600
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.173975 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791" exitCode=0
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.174085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791"}
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.176375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda"}
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.176399 4775 scope.go:117] "RemoveContainer" containerID="8d303e1558b2adccb16582693a30248fb7f96ca561d7dcb3104197e825dd15a7"
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.179158 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77fd86567d-mf2wb"
Mar 21 05:08:03 crc kubenswrapper[4775]: I0321 05:08:03.199592 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77fd86567d-mf2wb" podStartSLOduration=3.199525533 podStartE2EDuration="3.199525533s" podCreationTimestamp="2026-03-21 05:08:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:02.223273253 +0000 UTC m=+1235.199736877" watchObservedRunningTime="2026-03-21 05:08:03.199525533 +0000 UTC m=+1236.175989157"
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.190786 4775 generic.go:334] "Generic (PLEG): container finished" podID="3f765864-79ee-415c-99be-55894a20c087" containerID="4764d465ee215948cf3a07d17ef04460952c808ab216c2d6600d30877448b571" exitCode=0
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.193620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerDied","Data":"4764d465ee215948cf3a07d17ef04460952c808ab216c2d6600d30877448b571"}
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.196431 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-6klnl" event={"ID":"eccf102b-f91a-4251-b878-1c21eea92522","Type":"ContainerStarted","Data":"bca1df983777352aa51a410cf20699daccf33fb2338a4057481b7410153afd88"}
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.219199 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567828-6klnl" podStartSLOduration=2.298112147 podStartE2EDuration="4.219180948s" podCreationTimestamp="2026-03-21 05:08:00 +0000 UTC" firstStartedPulling="2026-03-21 05:08:01.760629588 +0000 UTC m=+1234.737093212" lastFinishedPulling="2026-03-21 05:08:03.681698389 +0000 UTC m=+1236.658162013" observedRunningTime="2026-03-21 05:08:04.217036397 +0000 UTC m=+1237.193500021" watchObservedRunningTime="2026-03-21 05:08:04.219180948 +0000 UTC m=+1237.195644582"
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.599663 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.681419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.681477 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.681512 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.681554 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.681964 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.682009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krwll\" (UniqueName: \"kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll\") pod \"3f765864-79ee-415c-99be-55894a20c087\" (UID: \"3f765864-79ee-415c-99be-55894a20c087\") "
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.687765 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.690363 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.690666 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll" (OuterVolumeSpecName: "kube-api-access-krwll") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "kube-api-access-krwll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.692936 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts" (OuterVolumeSpecName: "scripts") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.762955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.783976 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f765864-79ee-415c-99be-55894a20c087-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.784013 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krwll\" (UniqueName: \"kubernetes.io/projected/3f765864-79ee-415c-99be-55894a20c087-kube-api-access-krwll\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.784028 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.784040 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.784052 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.796485 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data" (OuterVolumeSpecName: "config-data") pod "3f765864-79ee-415c-99be-55894a20c087" (UID: "3f765864-79ee-415c-99be-55894a20c087"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:04 crc kubenswrapper[4775]: I0321 05:08:04.886013 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f765864-79ee-415c-99be-55894a20c087-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.224824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3f765864-79ee-415c-99be-55894a20c087","Type":"ContainerDied","Data":"85e53e5965bf199e0c4123a3954f4e5d437587f60519d3c96ac82b5f50bb3679"}
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.224863 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.225364 4775 scope.go:117] "RemoveContainer" containerID="0b6150bc28d08d5b96097378ba0b885d4bf49a94a51c13823ebc2ecd1793e28b"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.230212 4775 generic.go:334] "Generic (PLEG): container finished" podID="eccf102b-f91a-4251-b878-1c21eea92522" containerID="bca1df983777352aa51a410cf20699daccf33fb2338a4057481b7410153afd88" exitCode=0
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.230276 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-6klnl" event={"ID":"eccf102b-f91a-4251-b878-1c21eea92522","Type":"ContainerDied","Data":"bca1df983777352aa51a410cf20699daccf33fb2338a4057481b7410153afd88"}
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.262828 4775 scope.go:117] "RemoveContainer" containerID="4764d465ee215948cf3a07d17ef04460952c808ab216c2d6600d30877448b571"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.275763 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.295855 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.308973 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:08:05 crc kubenswrapper[4775]: E0321 05:08:05.309426 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="probe"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.309439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="probe"
Mar 21 05:08:05 crc kubenswrapper[4775]: E0321 05:08:05.309465 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="cinder-scheduler"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.309472 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="cinder-scheduler"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.309633 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="cinder-scheduler"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.309660 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f765864-79ee-415c-99be-55894a20c087" containerName="probe"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.310617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.312743 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.338015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.395975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.396017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqx6\" (UniqueName: \"kubernetes.io/projected/d02bb319-292e-449c-8e5b-42c6859f1529-kube-api-access-vsqx6\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.396357 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.396442 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-scripts\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.396476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d02bb319-292e-449c-8e5b-42c6859f1529-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.396541 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.399307 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:57804->10.217.0.147:8443: read: connection reset by peer"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.399907 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499433 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqx6\" (UniqueName: \"kubernetes.io/projected/d02bb319-292e-449c-8e5b-42c6859f1529-kube-api-access-vsqx6\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499591 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499618 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-scripts\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d02bb319-292e-449c-8e5b-42c6859f1529-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.499750 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d02bb319-292e-449c-8e5b-42c6859f1529-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.505169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.508169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.508532 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.510517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d02bb319-292e-449c-8e5b-42c6859f1529-scripts\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.520771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqx6\" (UniqueName: \"kubernetes.io/projected/d02bb319-292e-449c-8e5b-42c6859f1529-kube-api-access-vsqx6\") pod \"cinder-scheduler-0\" (UID: \"d02bb319-292e-449c-8e5b-42c6859f1529\") " pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.680692 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 05:08:05 crc kubenswrapper[4775]: I0321 05:08:05.682891 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f765864-79ee-415c-99be-55894a20c087" path="/var/lib/kubelet/pods/3f765864-79ee-415c-99be-55894a20c087/volumes"
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.175144 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.249735 4775 generic.go:334] "Generic (PLEG): container finished" podID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerID="1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa" exitCode=0
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.249826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerDied","Data":"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa"}
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.256364 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d02bb319-292e-449c-8e5b-42c6859f1529","Type":"ContainerStarted","Data":"617448a041c2ffe276be923a8975488baf07292190b1d4d69545558da5c6c9c9"}
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.743210 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-6klnl"
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.829436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt4qg\" (UniqueName: \"kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg\") pod \"eccf102b-f91a-4251-b878-1c21eea92522\" (UID: \"eccf102b-f91a-4251-b878-1c21eea92522\") "
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.843435 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw"
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.875273 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg" (OuterVolumeSpecName: "kube-api-access-gt4qg") pod "eccf102b-f91a-4251-b878-1c21eea92522" (UID: "eccf102b-f91a-4251-b878-1c21eea92522"). InnerVolumeSpecName "kube-api-access-gt4qg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.919342 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"]
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.919644 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="dnsmasq-dns" containerID="cri-o://18c26c2a6ec8c71c7881c713a6d2fb65eaa2258d47fd0f272a51e57881466532" gracePeriod=10
Mar 21 05:08:06 crc kubenswrapper[4775]: I0321 05:08:06.934503 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt4qg\" (UniqueName: \"kubernetes.io/projected/eccf102b-f91a-4251-b878-1c21eea92522-kube-api-access-gt4qg\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.290088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d02bb319-292e-449c-8e5b-42c6859f1529","Type":"ContainerStarted","Data":"f1a65482fa1320beaf95005d562b733e9a1acd6ed031fa590fa9340cdc95cc53"}
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.294560 4775 generic.go:334] "Generic (PLEG): container finished" podID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerID="18c26c2a6ec8c71c7881c713a6d2fb65eaa2258d47fd0f272a51e57881466532" exitCode=0
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.294634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" event={"ID":"f65ac8a3-4328-41af-8854-e8a2a9fce295","Type":"ContainerDied","Data":"18c26c2a6ec8c71c7881c713a6d2fb65eaa2258d47fd0f272a51e57881466532"}
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.305468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-6klnl" event={"ID":"eccf102b-f91a-4251-b878-1c21eea92522","Type":"ContainerDied","Data":"486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f"}
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.305553 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486b3020022220622c2fb587ef965adb79954d3ca5c1cadb0f004636ab8a7a5f"
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.305626 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-6klnl"
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.355199 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-pqlrw"]
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.386395 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-pqlrw"]
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.687502 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx"
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.690032 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f791f730-ffa2-44d3-b151-dc9d6e6e1743" path="/var/lib/kubelet/pods/f791f730-ffa2-44d3-b151-dc9d6e6e1743/volumes"
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.772611 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.772834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.772911 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vtjl\" (UniqueName: \"kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.772930 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.772962 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.773126 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb\") pod \"f65ac8a3-4328-41af-8854-e8a2a9fce295\" (UID: \"f65ac8a3-4328-41af-8854-e8a2a9fce295\") "
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.783323 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl" (OuterVolumeSpecName: "kube-api-access-6vtjl") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "kube-api-access-6vtjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.838844 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.857269 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.871657 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config" (OuterVolumeSpecName: "config") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.875345 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.875377 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vtjl\" (UniqueName: \"kubernetes.io/projected/f65ac8a3-4328-41af-8854-e8a2a9fce295-kube-api-access-6vtjl\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.875389 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.875397 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.914315 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.948348 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f65ac8a3-4328-41af-8854-e8a2a9fce295" (UID: "f65ac8a3-4328-41af-8854-e8a2a9fce295"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.976537 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:07 crc kubenswrapper[4775]: I0321 05:08:07.976572 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65ac8a3-4328-41af-8854-e8a2a9fce295-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.145180 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-66c879cfdd-smnxp"
Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.319342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d02bb319-292e-449c-8e5b-42c6859f1529","Type":"ContainerStarted","Data":"7218f41fdb3215097835d065eddc8dc9dc7b371f024366fe1ef833bd0e95e62d"}
Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.326652 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" event={"ID":"f65ac8a3-4328-41af-8854-e8a2a9fce295","Type":"ContainerDied","Data":"b2b7a3537eba6e1449d61c2042541c00aac4345e771c671fe8c458e056f81a53"}
Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.326693 4775 scope.go:117] "RemoveContainer"
containerID="18c26c2a6ec8c71c7881c713a6d2fb65eaa2258d47fd0f272a51e57881466532" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.326799 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-bvxtx" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.343875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.343857774 podStartE2EDuration="3.343857774s" podCreationTimestamp="2026-03-21 05:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:08.333753198 +0000 UTC m=+1241.310216832" watchObservedRunningTime="2026-03-21 05:08:08.343857774 +0000 UTC m=+1241.320321398" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.359398 4775 scope.go:117] "RemoveContainer" containerID="73687f66a6e199eea3343a248d8eb3f006e4d6f0bdcb64641db9e3b7a2d85758" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.371647 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"] Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.413856 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-bvxtx"] Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.859316 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:08 crc kubenswrapper[4775]: E0321 05:08:08.859953 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="init" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.859976 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="init" Mar 21 05:08:08 crc kubenswrapper[4775]: E0321 05:08:08.859988 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="dnsmasq-dns" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.859997 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="dnsmasq-dns" Mar 21 05:08:08 crc kubenswrapper[4775]: E0321 05:08:08.860041 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eccf102b-f91a-4251-b878-1c21eea92522" containerName="oc" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.860050 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="eccf102b-f91a-4251-b878-1c21eea92522" containerName="oc" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.860280 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="eccf102b-f91a-4251-b878-1c21eea92522" containerName="oc" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.860305 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" containerName="dnsmasq-dns" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.860993 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.864184 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6nwpr" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.864206 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.878741 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.882904 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:08 crc kubenswrapper[4775]: I0321 05:08:08.932553 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.009099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.009194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnqx\" (UniqueName: \"kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.009252 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.009328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.111399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.111803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.111917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnqx\" (UniqueName: \"kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.112010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc 
kubenswrapper[4775]: I0321 05:08:09.112784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.117719 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.118235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.127169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnqx\" (UniqueName: \"kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx\") pod \"openstackclient\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.160733 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.161384 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.172675 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.216943 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.221417 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.234865 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.321159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.321564 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config-secret\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.321609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.321661 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-knd2b\" (UniqueName: \"kubernetes.io/projected/bb5a0456-b5c5-433a-afde-fe38740e2310-kube-api-access-knd2b\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: E0321 05:08:09.369985 4775 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 21 05:08:09 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_dfe744b5-63ab-4df2-89cd-05605cfa4111_0(f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b" Netns:"/var/run/netns/7b7609de-10ad-42d6-b5d7-23f5745f15d5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b;K8S_POD_UID=dfe744b5-63ab-4df2-89cd-05605cfa4111" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/dfe744b5-63ab-4df2-89cd-05605cfa4111]: expected pod UID "dfe744b5-63ab-4df2-89cd-05605cfa4111" but got "bb5a0456-b5c5-433a-afde-fe38740e2310" from Kube API Mar 21 05:08:09 crc kubenswrapper[4775]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 05:08:09 crc kubenswrapper[4775]: > Mar 21 05:08:09 crc kubenswrapper[4775]: E0321 05:08:09.370069 4775 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 21 05:08:09 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_dfe744b5-63ab-4df2-89cd-05605cfa4111_0(f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b" Netns:"/var/run/netns/7b7609de-10ad-42d6-b5d7-23f5745f15d5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f8365ba4a6b3a7f175026f4a4a5c25207ab22c926e92448937608c75f2fd4f1b;K8S_POD_UID=dfe744b5-63ab-4df2-89cd-05605cfa4111" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/dfe744b5-63ab-4df2-89cd-05605cfa4111]: expected pod UID "dfe744b5-63ab-4df2-89cd-05605cfa4111" but got "bb5a0456-b5c5-433a-afde-fe38740e2310" from Kube API Mar 21 05:08:09 crc kubenswrapper[4775]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 05:08:09 crc kubenswrapper[4775]: > pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.423535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config-secret\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " 
pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.423587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.423628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knd2b\" (UniqueName: \"kubernetes.io/projected/bb5a0456-b5c5-433a-afde-fe38740e2310-kube-api-access-knd2b\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.423731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.424646 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.430182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.430212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bb5a0456-b5c5-433a-afde-fe38740e2310-openstack-config-secret\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.442712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knd2b\" (UniqueName: \"kubernetes.io/projected/bb5a0456-b5c5-433a-afde-fe38740e2310-kube-api-access-knd2b\") pod \"openstackclient\" (UID: \"bb5a0456-b5c5-433a-afde-fe38740e2310\") " pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.542841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.661351 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:09 crc kubenswrapper[4775]: I0321 05:08:09.678230 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65ac8a3-4328-41af-8854-e8a2a9fce295" path="/var/lib/kubelet/pods/f65ac8a3-4328-41af-8854-e8a2a9fce295/volumes" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.039073 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 05:08:10 crc kubenswrapper[4775]: W0321 05:08:10.045934 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb5a0456_b5c5_433a_afde_fe38740e2310.slice/crio-fa9018ccabaf8b89d5be7fae7d1f288a596ea8f63313f06d19d0fbac3f5ae1f1 WatchSource:0}: Error finding container fa9018ccabaf8b89d5be7fae7d1f288a596ea8f63313f06d19d0fbac3f5ae1f1: Status 404 returned error can't find the container with id fa9018ccabaf8b89d5be7fae7d1f288a596ea8f63313f06d19d0fbac3f5ae1f1 Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.358594 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bb5a0456-b5c5-433a-afde-fe38740e2310","Type":"ContainerStarted","Data":"fa9018ccabaf8b89d5be7fae7d1f288a596ea8f63313f06d19d0fbac3f5ae1f1"} Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.358604 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.368890 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.373561 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="dfe744b5-63ab-4df2-89cd-05605cfa4111" podUID="bb5a0456-b5c5-433a-afde-fe38740e2310" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.542615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret\") pod \"dfe744b5-63ab-4df2-89cd-05605cfa4111\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.542795 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config\") pod \"dfe744b5-63ab-4df2-89cd-05605cfa4111\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.542826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle\") pod \"dfe744b5-63ab-4df2-89cd-05605cfa4111\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.542909 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbnqx\" (UniqueName: \"kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx\") pod \"dfe744b5-63ab-4df2-89cd-05605cfa4111\" (UID: \"dfe744b5-63ab-4df2-89cd-05605cfa4111\") " Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.543937 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "dfe744b5-63ab-4df2-89cd-05605cfa4111" (UID: "dfe744b5-63ab-4df2-89cd-05605cfa4111"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.548330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx" (OuterVolumeSpecName: "kube-api-access-bbnqx") pod "dfe744b5-63ab-4df2-89cd-05605cfa4111" (UID: "dfe744b5-63ab-4df2-89cd-05605cfa4111"). InnerVolumeSpecName "kube-api-access-bbnqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.560938 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfe744b5-63ab-4df2-89cd-05605cfa4111" (UID: "dfe744b5-63ab-4df2-89cd-05605cfa4111"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.562224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "dfe744b5-63ab-4df2-89cd-05605cfa4111" (UID: "dfe744b5-63ab-4df2-89cd-05605cfa4111"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.644945 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.644983 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbnqx\" (UniqueName: \"kubernetes.io/projected/dfe744b5-63ab-4df2-89cd-05605cfa4111-kube-api-access-bbnqx\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.644998 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.645009 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dfe744b5-63ab-4df2-89cd-05605cfa4111-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:10 crc kubenswrapper[4775]: I0321 05:08:10.681517 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 05:08:11 crc kubenswrapper[4775]: I0321 05:08:11.374537 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 05:08:11 crc kubenswrapper[4775]: I0321 05:08:11.412953 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 21 05:08:11 crc kubenswrapper[4775]: I0321 05:08:11.448637 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="dfe744b5-63ab-4df2-89cd-05605cfa4111" podUID="bb5a0456-b5c5-433a-afde-fe38740e2310" Mar 21 05:08:11 crc kubenswrapper[4775]: I0321 05:08:11.686093 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe744b5-63ab-4df2-89cd-05605cfa4111" path="/var/lib/kubelet/pods/dfe744b5-63ab-4df2-89cd-05605cfa4111/volumes" Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.538935 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.843801 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77fd86567d-mf2wb" Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.913251 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"] Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.913702 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" containerID="cri-o://da40033a5dd52e534fbf0e897d41a379c55c497ad81da6c097f013d7acf158d3" gracePeriod=30 Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.914126 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" containerID="cri-o://9e09a5090c4d61aabf6c91b2097f27d3892ae005a5644d72b8acf7ad49174027" 
gracePeriod=30 Mar 21 05:08:12 crc kubenswrapper[4775]: I0321 05:08:12.926642 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 21 05:08:13 crc kubenswrapper[4775]: I0321 05:08:13.422988 4775 generic.go:334] "Generic (PLEG): container finished" podID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerID="da40033a5dd52e534fbf0e897d41a379c55c497ad81da6c097f013d7acf158d3" exitCode=143 Mar 21 05:08:13 crc kubenswrapper[4775]: I0321 05:08:13.423043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerDied","Data":"da40033a5dd52e534fbf0e897d41a379c55c497ad81da6c097f013d7acf158d3"} Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.951369 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-f9b88fb79-vclnv"] Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.957829 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.960540 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.960565 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f9b88fb79-vclnv"] Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.960703 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.960706 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 21 05:08:14 crc kubenswrapper[4775]: I0321 05:08:14.979464 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.082772 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfjng\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-kube-api-access-mfjng\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.082823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-etc-swift\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 
05:08:15.082914 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-combined-ca-bundle\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.083000 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-public-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.083076 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-config-data\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.083361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-run-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.083454 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-internal-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc 
kubenswrapper[4775]: I0321 05:08:15.083527 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-log-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.184995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfjng\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-kube-api-access-mfjng\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-etc-swift\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-combined-ca-bundle\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-public-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185165 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-config-data\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-run-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-internal-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-log-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.185760 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-log-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.187937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77432545-f22c-453a-b6a7-7c932712efa9-run-httpd\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.192182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-combined-ca-bundle\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.193019 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-config-data\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.192982 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-internal-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.194084 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-etc-swift\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.208915 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77432545-f22c-453a-b6a7-7c932712efa9-public-tls-certs\") pod \"swift-proxy-f9b88fb79-vclnv\" 
(UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.210978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfjng\" (UniqueName: \"kubernetes.io/projected/77432545-f22c-453a-b6a7-7c932712efa9-kube-api-access-mfjng\") pod \"swift-proxy-f9b88fb79-vclnv\" (UID: \"77432545-f22c-453a-b6a7-7c932712efa9\") " pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.295350 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:15 crc kubenswrapper[4775]: I0321 05:08:15.945341 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f9b88fb79-vclnv"] Mar 21 05:08:15 crc kubenswrapper[4775]: W0321 05:08:15.958393 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77432545_f22c_453a_b6a7_7c932712efa9.slice/crio-bcdf50870f46b4a992b7a1625618f6249d5ebc5618940c1849f89429598adbec WatchSource:0}: Error finding container bcdf50870f46b4a992b7a1625618f6249d5ebc5618940c1849f89429598adbec: Status 404 returned error can't find the container with id bcdf50870f46b4a992b7a1625618f6249d5ebc5618940c1849f89429598adbec Mar 21 05:08:16 crc kubenswrapper[4775]: I0321 05:08:16.283025 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 05:08:16 crc kubenswrapper[4775]: I0321 05:08:16.472902 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f9b88fb79-vclnv" event={"ID":"77432545-f22c-453a-b6a7-7c932712efa9","Type":"ContainerStarted","Data":"bcdf50870f46b4a992b7a1625618f6249d5ebc5618940c1849f89429598adbec"} Mar 21 05:08:16 crc kubenswrapper[4775]: I0321 05:08:16.483669 4775 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ceilometer-0" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 21 05:08:16 crc kubenswrapper[4775]: I0321 05:08:16.976522 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:08:16 crc kubenswrapper[4775]: I0321 05:08:16.990440 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.182520 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-tmm94"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.189407 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.194043 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tmm94"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.328263 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.328398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh9x\" (UniqueName: \"kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.382264 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-db-create-wxmkg"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.383753 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.390172 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a38c-account-create-update-x226p"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.391622 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.394557 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.397878 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wxmkg"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.405219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a38c-account-create-update-x226p"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.418558 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:58266->10.217.0.166:9311: read: connection reset by peer" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.418597 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:58256->10.217.0.166:9311: read: connection reset by peer" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.432510 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.432570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.432600 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6wb\" (UniqueName: \"kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.432709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh9x\" (UniqueName: \"kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.433916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.451575 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xkh9x\" (UniqueName: \"kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x\") pod \"nova-api-db-create-tmm94\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.493247 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zfz4w"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.494551 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.514407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f9b88fb79-vclnv" event={"ID":"77432545-f22c-453a-b6a7-7c932712efa9","Type":"ContainerStarted","Data":"9eedba89840a3fd35f983095f0bd1a2e6c833d019237a66f24e8a6d46f58b2aa"} Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.529345 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zfz4w"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.529819 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.531744 4775 generic.go:334] "Generic (PLEG): container finished" podID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerID="9e09a5090c4d61aabf6c91b2097f27d3892ae005a5644d72b8acf7ad49174027" exitCode=0 Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.532183 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerDied","Data":"9e09a5090c4d61aabf6c91b2097f27d3892ae005a5644d72b8acf7ad49174027"} Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.537583 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.539656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbnj\" (UniqueName: \"kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.539709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.539747 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6wb\" (UniqueName: \"kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.540585 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.558631 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6wb\" (UniqueName: \"kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb\") pod \"nova-cell0-db-create-wxmkg\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.586842 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db3c-account-create-update-k29zp"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.588853 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.591896 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.612581 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db3c-account-create-update-k29zp"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.641948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.642052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.642219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts\") pod \"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.642250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqfwp\" (UniqueName: \"kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp\") pod 
\"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.642313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txzb7\" (UniqueName: \"kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.642529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbnj\" (UniqueName: \"kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.643173 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.666500 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbnj\" (UniqueName: \"kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj\") pod \"nova-api-a38c-account-create-update-x226p\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.699264 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.723945 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.744472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts\") pod \"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.744528 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqfwp\" (UniqueName: \"kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp\") pod \"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.744558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txzb7\" (UniqueName: \"kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.744703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.745413 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts\") pod \"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.745467 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.770890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txzb7\" (UniqueName: \"kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7\") pod \"nova-cell1-db-create-zfz4w\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.775005 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqfwp\" (UniqueName: \"kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp\") pod \"nova-cell0-db3c-account-create-update-k29zp\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.805442 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-45fb-account-create-update-glct2"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.807241 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.810920 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.827254 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-45fb-account-create-update-glct2"] Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.836530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.921733 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.949501 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts\") pod \"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:17 crc kubenswrapper[4775]: I0321 05:08:17.949570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47c7f\" (UniqueName: \"kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f\") pod \"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:18 crc kubenswrapper[4775]: I0321 05:08:18.051954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts\") pod 
\"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:18 crc kubenswrapper[4775]: I0321 05:08:18.052014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47c7f\" (UniqueName: \"kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f\") pod \"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:18 crc kubenswrapper[4775]: I0321 05:08:18.052698 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts\") pod \"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:18 crc kubenswrapper[4775]: I0321 05:08:18.070986 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47c7f\" (UniqueName: \"kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f\") pod \"nova-cell1-45fb-account-create-update-glct2\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:18 crc kubenswrapper[4775]: I0321 05:08:18.181757 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:19 crc kubenswrapper[4775]: I0321 05:08:19.500200 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:08:19 crc kubenswrapper[4775]: I0321 05:08:19.555680 4775 generic.go:334] "Generic (PLEG): container finished" podID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerID="507f523f41fa20fba933e438c92908136440aa69787cc4ade1ab3f5729729a60" exitCode=137 Mar 21 05:08:19 crc kubenswrapper[4775]: I0321 05:08:19.555733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerDied","Data":"507f523f41fa20fba933e438c92908136440aa69787cc4ade1ab3f5729729a60"} Mar 21 05:08:20 crc kubenswrapper[4775]: I0321 05:08:20.990444 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:08:20 crc kubenswrapper[4775]: I0321 05:08:20.993088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58948d8bb4-rcw89" Mar 21 05:08:21 crc kubenswrapper[4775]: I0321 05:08:21.852036 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 21 05:08:21 crc kubenswrapper[4775]: I0321 05:08:21.852053 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Mar 21 05:08:24 crc kubenswrapper[4775]: I0321 05:08:24.980004 4775 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64fb567758-hd2ld" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.478525 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.593711 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.651671 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ac5c16c-56ac-4299-ae61-a8200986ce10","Type":"ContainerDied","Data":"25ef4b17ffe6969bdd8510a4d651e67244e698c334397d84b7e113488d96c6fb"} Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.651714 4775 scope.go:117] "RemoveContainer" containerID="507f523f41fa20fba933e438c92908136440aa69787cc4ade1ab3f5729729a60" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.651823 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.661172 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.678460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bzmg\" (UniqueName: \"kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg\") pod \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.678735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data\") pod \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.678813 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle\") pod \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.678921 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs\") pod \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.679052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom\") pod \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\" (UID: \"86abc8a1-1d5e-4dad-9e62-4f1c88608b95\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.682596 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs" (OuterVolumeSpecName: "logs") pod "86abc8a1-1d5e-4dad-9e62-4f1c88608b95" (UID: "86abc8a1-1d5e-4dad-9e62-4f1c88608b95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.693521 4775 scope.go:117] "RemoveContainer" containerID="3ffce781b6baa0a2f3ae718a0cd398633d05ff48719dba863adc5152b47fe849" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.696289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86abc8a1-1d5e-4dad-9e62-4f1c88608b95" (UID: "86abc8a1-1d5e-4dad-9e62-4f1c88608b95"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.701104 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg" (OuterVolumeSpecName: "kube-api-access-6bzmg") pod "86abc8a1-1d5e-4dad-9e62-4f1c88608b95" (UID: "86abc8a1-1d5e-4dad-9e62-4f1c88608b95"). InnerVolumeSpecName "kube-api-access-6bzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.734526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86abc8a1-1d5e-4dad-9e62-4f1c88608b95" (UID: "86abc8a1-1d5e-4dad-9e62-4f1c88608b95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.742390 4775 scope.go:117] "RemoveContainer" containerID="4e1e7c557d6fa58625eeced787347009a65ed5b8a3da8a6487f343ffff4e6c90" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.759946 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc8b6bf8b-btcqr" event={"ID":"86abc8a1-1d5e-4dad-9e62-4f1c88608b95","Type":"ContainerDied","Data":"56894b1d2f57fb4749d6394cfec85a58eb1ab9e8c5e0205ea22bb5116fe8c5c9"} Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.778829 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data" (OuterVolumeSpecName: "config-data") pod "86abc8a1-1d5e-4dad-9e62-4f1c88608b95" (UID: "86abc8a1-1d5e-4dad-9e62-4f1c88608b95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780524 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780625 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: 
\"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780756 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780916 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8h8k\" (UniqueName: \"kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.780979 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781138 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml\") pod \"8ac5c16c-56ac-4299-ae61-a8200986ce10\" (UID: \"8ac5c16c-56ac-4299-ae61-a8200986ce10\") " Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781852 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bzmg\" (UniqueName: \"kubernetes.io/projected/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-kube-api-access-6bzmg\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781892 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781905 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781918 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.781930 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86abc8a1-1d5e-4dad-9e62-4f1c88608b95-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.785969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.787359 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.787850 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k" (OuterVolumeSpecName: "kube-api-access-w8h8k") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "kube-api-access-w8h8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.790439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts" (OuterVolumeSpecName: "scripts") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.787602 4775 scope.go:117] "RemoveContainer" containerID="9e09a5090c4d61aabf6c91b2097f27d3892ae005a5644d72b8acf7ad49174027" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.817496 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.858613 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.869215 4775 scope.go:117] "RemoveContainer" containerID="da40033a5dd52e534fbf0e897d41a379c55c497ad81da6c097f013d7acf158d3" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884252 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884292 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8h8k\" (UniqueName: \"kubernetes.io/projected/8ac5c16c-56ac-4299-ae61-a8200986ce10-kube-api-access-w8h8k\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884301 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884309 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ac5c16c-56ac-4299-ae61-a8200986ce10-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884317 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.884325 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.899727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data" (OuterVolumeSpecName: "config-data") pod "8ac5c16c-56ac-4299-ae61-a8200986ce10" (UID: "8ac5c16c-56ac-4299-ae61-a8200986ce10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.926552 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7d65998c7c-prp5b" Mar 21 05:08:25 crc kubenswrapper[4775]: I0321 05:08:25.986758 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac5c16c-56ac-4299-ae61-a8200986ce10-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:26 crc kubenswrapper[4775]: W0321 05:08:26.017416 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a5e145_5b67_4d02_9104_ade3e48888db.slice/crio-e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02 WatchSource:0}: Error finding container e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02: Status 404 returned error can't find the container with id e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02 Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.026997 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db3c-account-create-update-k29zp"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.043057 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.043333 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68ccf5bf68-lf5dz" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-api" containerID="cri-o://f14a2e346ace37be895dcc94dd25c65e41716e7e754b07d2e1e3277674aeac90" gracePeriod=30 Mar 21 05:08:26 crc kubenswrapper[4775]: 
I0321 05:08:26.043477 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68ccf5bf68-lf5dz" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-httpd" containerID="cri-o://06c961a967b1c07bf826487fd3c050eb92ff3106f515b9136df5283020977b92" gracePeriod=30 Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.098849 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.125216 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.142296 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.152077 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7dc8b6bf8b-btcqr"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.164620 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:26 crc kubenswrapper[4775]: E0321 05:08:26.165079 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="proxy-httpd" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165095 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="proxy-httpd" Mar 21 05:08:26 crc kubenswrapper[4775]: E0321 05:08:26.165134 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165143 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" Mar 21 05:08:26 crc kubenswrapper[4775]: E0321 05:08:26.165161 4775 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165169 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" Mar 21 05:08:26 crc kubenswrapper[4775]: E0321 05:08:26.165195 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="sg-core" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165202 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="sg-core" Mar 21 05:08:26 crc kubenswrapper[4775]: E0321 05:08:26.165214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="ceilometer-notification-agent" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165221 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="ceilometer-notification-agent" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165437 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="ceilometer-notification-agent" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165452 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="sg-core" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165472 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165483 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" containerName="barbican-api-log" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.165492 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" containerName="proxy-httpd" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.173451 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.175388 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.178370 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.178859 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.295532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.295900 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.295947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.295974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.296079 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.296188 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.296380 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cmhz\" (UniqueName: \"kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.372555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-45fb-account-create-update-glct2"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.398756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.398828 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.398866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.398986 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.399042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.399096 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cmhz\" (UniqueName: \"kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.399225 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.399744 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.401434 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.416797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wxmkg"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.425470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.428064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cmhz\" (UniqueName: \"kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.434473 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 
05:08:26.440235 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.446154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.451305 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a38c-account-create-update-x226p"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.468734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-tmm94"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.478444 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zfz4w"] Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.600602 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.690650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxmkg" event={"ID":"b65ed673-344a-4064-ae90-c9d20964c648","Type":"ContainerStarted","Data":"a48ddd114a87c3d3cbfffa70d9394b6b99e8bd71b385ee3f057d2bf9344dd199"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.692398 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f9b88fb79-vclnv" event={"ID":"77432545-f22c-453a-b6a7-7c932712efa9","Type":"ContainerStarted","Data":"3feb14a77d1251016f9a88a978c0277c39b5f555f9ba1d5f8c9fa84f835e2d6d"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.693849 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.693875 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.697135 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a38c-account-create-update-x226p" event={"ID":"028d46f7-6f14-40f6-a6cf-77aeec59a99e","Type":"ContainerStarted","Data":"c62adef6e5f9bf0990f5fcc77903287bc27cc215bf5cbcd7af08a7afb5a9a0dc"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.698168 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zfz4w" event={"ID":"87e82da2-fdd4-4ead-8a9b-022e55a03690","Type":"ContainerStarted","Data":"ecc4e3a464591c3ae12895248b97be713619753b60d765c32197f20eefb25c74"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.699479 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bb5a0456-b5c5-433a-afde-fe38740e2310","Type":"ContainerStarted","Data":"a33772e7a61f00a233eb59cbe3495ba3e1ac72e11bed9784618da270828a271e"} Mar 21 05:08:26 crc 
kubenswrapper[4775]: I0321 05:08:26.703345 4775 generic.go:334] "Generic (PLEG): container finished" podID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerID="06c961a967b1c07bf826487fd3c050eb92ff3106f515b9136df5283020977b92" exitCode=0 Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.703584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerDied","Data":"06c961a967b1c07bf826487fd3c050eb92ff3106f515b9136df5283020977b92"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.709254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmm94" event={"ID":"bc238c51-34cc-4cd6-bdf2-00e3747a9d67","Type":"ContainerStarted","Data":"a900d64d00970943729e3dd5370a47455bba6366eba49077571ff7b009d713fa"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.711906 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-f9b88fb79-vclnv" podUID="77432545-f22c-453a-b6a7-7c932712efa9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.713092 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" event={"ID":"22a5e145-5b67-4d02-9104-ade3e48888db","Type":"ContainerStarted","Data":"eb428452fe127829b5fdd3edea1e21f849f0fb951673f81da2334a3d27807ded"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.713136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" event={"ID":"22a5e145-5b67-4d02-9104-ade3e48888db","Type":"ContainerStarted","Data":"e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.744239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-45fb-account-create-update-glct2" event={"ID":"330e8d60-7af3-42ed-a0d7-234264037d09","Type":"ContainerStarted","Data":"e350cae4a6e948bcd9c31418f7d52ae8e550001eb6a2f7a579e8a1278c01e74c"} Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.745409 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-f9b88fb79-vclnv" podStartSLOduration=12.745394242 podStartE2EDuration="12.745394242s" podCreationTimestamp="2026-03-21 05:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:26.717530435 +0000 UTC m=+1259.693994049" watchObservedRunningTime="2026-03-21 05:08:26.745394242 +0000 UTC m=+1259.721857866" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.750483 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" podStartSLOduration=9.750467826 podStartE2EDuration="9.750467826s" podCreationTimestamp="2026-03-21 05:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:26.747173453 +0000 UTC m=+1259.723637087" watchObservedRunningTime="2026-03-21 05:08:26.750467826 +0000 UTC m=+1259.726931450" Mar 21 05:08:26 crc kubenswrapper[4775]: I0321 05:08:26.796792 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.473224251 podStartE2EDuration="17.796769394s" podCreationTimestamp="2026-03-21 05:08:09 +0000 UTC" firstStartedPulling="2026-03-21 05:08:10.048841198 +0000 UTC m=+1243.025304822" lastFinishedPulling="2026-03-21 05:08:25.372386341 +0000 UTC m=+1258.348849965" observedRunningTime="2026-03-21 05:08:26.781081161 +0000 UTC m=+1259.757544775" watchObservedRunningTime="2026-03-21 05:08:26.796769394 +0000 UTC m=+1259.773233018" Mar 21 05:08:26 
crc kubenswrapper[4775]: I0321 05:08:26.993039 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.011461 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.684800 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86abc8a1-1d5e-4dad-9e62-4f1c88608b95" path="/var/lib/kubelet/pods/86abc8a1-1d5e-4dad-9e62-4f1c88608b95/volumes" Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.686586 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac5c16c-56ac-4299-ae61-a8200986ce10" path="/var/lib/kubelet/pods/8ac5c16c-56ac-4299-ae61-a8200986ce10/volumes" Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.755134 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-45fb-account-create-update-glct2" event={"ID":"330e8d60-7af3-42ed-a0d7-234264037d09","Type":"ContainerStarted","Data":"1fd0a588f9d34ae16dba50e92cfc4f254b48cad183d5ee54a522ba1c5c91886c"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.757319 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerStarted","Data":"03c1e47b1d8dd7264986151b452287f41c3c0c4a42e00d07ccce4db26c6d480e"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.765895 4775 generic.go:334] "Generic (PLEG): container finished" podID="028d46f7-6f14-40f6-a6cf-77aeec59a99e" containerID="959f27ca1f8883c27a7d20345a9d58bad88607efdd52bc8596a92a7571fa819e" exitCode=0 Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.765981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a38c-account-create-update-x226p" 
event={"ID":"028d46f7-6f14-40f6-a6cf-77aeec59a99e","Type":"ContainerDied","Data":"959f27ca1f8883c27a7d20345a9d58bad88607efdd52bc8596a92a7571fa819e"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.770743 4775 generic.go:334] "Generic (PLEG): container finished" podID="87e82da2-fdd4-4ead-8a9b-022e55a03690" containerID="f761be7525081d117b389918983117196fdba3258fd2ecd1f48a78a479770d1c" exitCode=0 Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.770797 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zfz4w" event={"ID":"87e82da2-fdd4-4ead-8a9b-022e55a03690","Type":"ContainerDied","Data":"f761be7525081d117b389918983117196fdba3258fd2ecd1f48a78a479770d1c"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.773633 4775 generic.go:334] "Generic (PLEG): container finished" podID="bc238c51-34cc-4cd6-bdf2-00e3747a9d67" containerID="647e5b30ad956a8a9a631808f860ad303cf32d4aa053654a5cdcf1129262af27" exitCode=0 Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.773710 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmm94" event={"ID":"bc238c51-34cc-4cd6-bdf2-00e3747a9d67","Type":"ContainerDied","Data":"647e5b30ad956a8a9a631808f860ad303cf32d4aa053654a5cdcf1129262af27"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.776517 4775 generic.go:334] "Generic (PLEG): container finished" podID="b65ed673-344a-4064-ae90-c9d20964c648" containerID="0ff1159b21db4525f027b0be154a99974ea3e42eac6bac07387f589a08d088ce" exitCode=0 Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.776594 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxmkg" event={"ID":"b65ed673-344a-4064-ae90-c9d20964c648","Type":"ContainerDied","Data":"0ff1159b21db4525f027b0be154a99974ea3e42eac6bac07387f589a08d088ce"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.778318 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="22a5e145-5b67-4d02-9104-ade3e48888db" containerID="eb428452fe127829b5fdd3edea1e21f849f0fb951673f81da2334a3d27807ded" exitCode=0 Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.778410 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" event={"ID":"22a5e145-5b67-4d02-9104-ade3e48888db","Type":"ContainerDied","Data":"eb428452fe127829b5fdd3edea1e21f849f0fb951673f81da2334a3d27807ded"} Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.793484 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:27 crc kubenswrapper[4775]: I0321 05:08:27.932690 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-45fb-account-create-update-glct2" podStartSLOduration=10.932671226 podStartE2EDuration="10.932671226s" podCreationTimestamp="2026-03-21 05:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:27.901530426 +0000 UTC m=+1260.877994060" watchObservedRunningTime="2026-03-21 05:08:27.932671226 +0000 UTC m=+1260.909134850" Mar 21 05:08:28 crc kubenswrapper[4775]: I0321 05:08:28.787989 4775 generic.go:334] "Generic (PLEG): container finished" podID="330e8d60-7af3-42ed-a0d7-234264037d09" containerID="1fd0a588f9d34ae16dba50e92cfc4f254b48cad183d5ee54a522ba1c5c91886c" exitCode=0 Mar 21 05:08:28 crc kubenswrapper[4775]: I0321 05:08:28.788065 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-45fb-account-create-update-glct2" event={"ID":"330e8d60-7af3-42ed-a0d7-234264037d09","Type":"ContainerDied","Data":"1fd0a588f9d34ae16dba50e92cfc4f254b48cad183d5ee54a522ba1c5c91886c"} Mar 21 05:08:28 crc kubenswrapper[4775]: I0321 05:08:28.790369 4775 generic.go:334] "Generic (PLEG): container finished" podID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" 
containerID="f14a2e346ace37be895dcc94dd25c65e41716e7e754b07d2e1e3277674aeac90" exitCode=0 Mar 21 05:08:28 crc kubenswrapper[4775]: I0321 05:08:28.790487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerDied","Data":"f14a2e346ace37be895dcc94dd25c65e41716e7e754b07d2e1e3277674aeac90"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.267747 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.366573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbnj\" (UniqueName: \"kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj\") pod \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.367913 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts\") pod \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\" (UID: \"028d46f7-6f14-40f6-a6cf-77aeec59a99e\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.368429 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "028d46f7-6f14-40f6-a6cf-77aeec59a99e" (UID: "028d46f7-6f14-40f6-a6cf-77aeec59a99e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.368920 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/028d46f7-6f14-40f6-a6cf-77aeec59a99e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.370729 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj" (OuterVolumeSpecName: "kube-api-access-jcbnj") pod "028d46f7-6f14-40f6-a6cf-77aeec59a99e" (UID: "028d46f7-6f14-40f6-a6cf-77aeec59a99e"). InnerVolumeSpecName "kube-api-access-jcbnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.459936 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.472820 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.473593 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbnj\" (UniqueName: \"kubernetes.io/projected/028d46f7-6f14-40f6-a6cf-77aeec59a99e-kube-api-access-jcbnj\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.493916 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.500334 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.581710 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkh9x\" (UniqueName: \"kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x\") pod \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.581840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txzb7\" (UniqueName: \"kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7\") pod \"87e82da2-fdd4-4ead-8a9b-022e55a03690\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.581938 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts\") pod \"22a5e145-5b67-4d02-9104-ade3e48888db\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.581971 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqfwp\" (UniqueName: \"kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp\") pod \"22a5e145-5b67-4d02-9104-ade3e48888db\" (UID: \"22a5e145-5b67-4d02-9104-ade3e48888db\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.582047 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts\") pod \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\" (UID: \"bc238c51-34cc-4cd6-bdf2-00e3747a9d67\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.582136 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts\") pod \"87e82da2-fdd4-4ead-8a9b-022e55a03690\" (UID: \"87e82da2-fdd4-4ead-8a9b-022e55a03690\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.584213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc238c51-34cc-4cd6-bdf2-00e3747a9d67" (UID: "bc238c51-34cc-4cd6-bdf2-00e3747a9d67"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.584353 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87e82da2-fdd4-4ead-8a9b-022e55a03690" (UID: "87e82da2-fdd4-4ead-8a9b-022e55a03690"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.584818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22a5e145-5b67-4d02-9104-ade3e48888db" (UID: "22a5e145-5b67-4d02-9104-ade3e48888db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.604939 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7" (OuterVolumeSpecName: "kube-api-access-txzb7") pod "87e82da2-fdd4-4ead-8a9b-022e55a03690" (UID: "87e82da2-fdd4-4ead-8a9b-022e55a03690"). 
InnerVolumeSpecName "kube-api-access-txzb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.604988 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp" (OuterVolumeSpecName: "kube-api-access-lqfwp") pod "22a5e145-5b67-4d02-9104-ade3e48888db" (UID: "22a5e145-5b67-4d02-9104-ade3e48888db"). InnerVolumeSpecName "kube-api-access-lqfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.611347 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x" (OuterVolumeSpecName: "kube-api-access-xkh9x") pod "bc238c51-34cc-4cd6-bdf2-00e3747a9d67" (UID: "bc238c51-34cc-4cd6-bdf2-00e3747a9d67"). InnerVolumeSpecName "kube-api-access-xkh9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.683706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts\") pod \"b65ed673-344a-4064-ae90-c9d20964c648\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684037 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb6wb\" (UniqueName: \"kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb\") pod \"b65ed673-344a-4064-ae90-c9d20964c648\" (UID: \"b65ed673-344a-4064-ae90-c9d20964c648\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684469 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e82da2-fdd4-4ead-8a9b-022e55a03690-operator-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684485 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkh9x\" (UniqueName: \"kubernetes.io/projected/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-kube-api-access-xkh9x\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684494 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txzb7\" (UniqueName: \"kubernetes.io/projected/87e82da2-fdd4-4ead-8a9b-022e55a03690-kube-api-access-txzb7\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684503 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22a5e145-5b67-4d02-9104-ade3e48888db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684512 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqfwp\" (UniqueName: \"kubernetes.io/projected/22a5e145-5b67-4d02-9104-ade3e48888db-kube-api-access-lqfwp\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.684520 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc238c51-34cc-4cd6-bdf2-00e3747a9d67-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.686767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b65ed673-344a-4064-ae90-c9d20964c648" (UID: "b65ed673-344a-4064-ae90-c9d20964c648"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.688413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb" (OuterVolumeSpecName: "kube-api-access-wb6wb") pod "b65ed673-344a-4064-ae90-c9d20964c648" (UID: "b65ed673-344a-4064-ae90-c9d20964c648"). InnerVolumeSpecName "kube-api-access-wb6wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.689690 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.785594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle\") pod \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.785687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs\") pod \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.785753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config\") pod \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.785831 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config\") pod \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.785915 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv44z\" (UniqueName: \"kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z\") pod \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\" (UID: \"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4\") " Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.786326 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb6wb\" (UniqueName: \"kubernetes.io/projected/b65ed673-344a-4064-ae90-c9d20964c648-kube-api-access-wb6wb\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.786344 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b65ed673-344a-4064-ae90-c9d20964c648-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.790355 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" (UID: "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.790941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z" (OuterVolumeSpecName: "kube-api-access-bv44z") pod "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" (UID: "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4"). InnerVolumeSpecName "kube-api-access-bv44z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.809597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-tmm94" event={"ID":"bc238c51-34cc-4cd6-bdf2-00e3747a9d67","Type":"ContainerDied","Data":"a900d64d00970943729e3dd5370a47455bba6366eba49077571ff7b009d713fa"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.809671 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a900d64d00970943729e3dd5370a47455bba6366eba49077571ff7b009d713fa" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.809759 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-tmm94" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.812500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" event={"ID":"22a5e145-5b67-4d02-9104-ade3e48888db","Type":"ContainerDied","Data":"e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.812550 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e609ddd387c7c54bd1869b55edbc31d421deac134c26c25fe6a20a50afdcea02" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.812592 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db3c-account-create-update-k29zp" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.814066 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerStarted","Data":"15da433bbe56b39c4ebf0eff9011ebe47b1beb92e06cabf9cd3d182dd0437ac0"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.815867 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a38c-account-create-update-x226p" event={"ID":"028d46f7-6f14-40f6-a6cf-77aeec59a99e","Type":"ContainerDied","Data":"c62adef6e5f9bf0990f5fcc77903287bc27cc215bf5cbcd7af08a7afb5a9a0dc"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.815904 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c62adef6e5f9bf0990f5fcc77903287bc27cc215bf5cbcd7af08a7afb5a9a0dc" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.815959 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a38c-account-create-update-x226p" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.818638 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zfz4w" event={"ID":"87e82da2-fdd4-4ead-8a9b-022e55a03690","Type":"ContainerDied","Data":"ecc4e3a464591c3ae12895248b97be713619753b60d765c32197f20eefb25c74"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.818669 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc4e3a464591c3ae12895248b97be713619753b60d765c32197f20eefb25c74" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.818725 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-zfz4w" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.821455 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68ccf5bf68-lf5dz" event={"ID":"6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4","Type":"ContainerDied","Data":"785cf1f0611121bc184a4bc0f34377c346c490853673595b9763600cad5d4149"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.821498 4775 scope.go:117] "RemoveContainer" containerID="06c961a967b1c07bf826487fd3c050eb92ff3106f515b9136df5283020977b92" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.821626 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68ccf5bf68-lf5dz" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.827663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wxmkg" event={"ID":"b65ed673-344a-4064-ae90-c9d20964c648","Type":"ContainerDied","Data":"a48ddd114a87c3d3cbfffa70d9394b6b99e8bd71b385ee3f057d2bf9344dd199"} Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.827728 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48ddd114a87c3d3cbfffa70d9394b6b99e8bd71b385ee3f057d2bf9344dd199" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.827814 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wxmkg" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.854287 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config" (OuterVolumeSpecName: "config") pod "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" (UID: "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.854430 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" (UID: "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.878549 4775 scope.go:117] "RemoveContainer" containerID="f14a2e346ace37be895dcc94dd25c65e41716e7e754b07d2e1e3277674aeac90" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.888543 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.888572 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.888589 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.888600 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv44z\" (UniqueName: \"kubernetes.io/projected/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-kube-api-access-bv44z\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.911828 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs" (OuterVolumeSpecName: 
"ovndb-tls-certs") pod "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" (UID: "6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:29 crc kubenswrapper[4775]: I0321 05:08:29.990733 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.137103 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.243602 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.253877 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-68ccf5bf68-lf5dz"] Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.299220 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts\") pod \"330e8d60-7af3-42ed-a0d7-234264037d09\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.299614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47c7f\" (UniqueName: \"kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f\") pod \"330e8d60-7af3-42ed-a0d7-234264037d09\" (UID: \"330e8d60-7af3-42ed-a0d7-234264037d09\") " Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.299779 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"330e8d60-7af3-42ed-a0d7-234264037d09" (UID: "330e8d60-7af3-42ed-a0d7-234264037d09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.300395 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/330e8d60-7af3-42ed-a0d7-234264037d09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.311898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f9b88fb79-vclnv" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.311981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f" (OuterVolumeSpecName: "kube-api-access-47c7f") pod "330e8d60-7af3-42ed-a0d7-234264037d09" (UID: "330e8d60-7af3-42ed-a0d7-234264037d09"). InnerVolumeSpecName "kube-api-access-47c7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.401987 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47c7f\" (UniqueName: \"kubernetes.io/projected/330e8d60-7af3-42ed-a0d7-234264037d09-kube-api-access-47c7f\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.852020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerStarted","Data":"59ff223fce72d0f0a5dcf748449e4af7793a03edd914b1484e21b7aca4e40d65"} Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.880594 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-45fb-account-create-update-glct2" Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.880870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-45fb-account-create-update-glct2" event={"ID":"330e8d60-7af3-42ed-a0d7-234264037d09","Type":"ContainerDied","Data":"e350cae4a6e948bcd9c31418f7d52ae8e550001eb6a2f7a579e8a1278c01e74c"} Mar 21 05:08:30 crc kubenswrapper[4775]: I0321 05:08:30.881316 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e350cae4a6e948bcd9c31418f7d52ae8e550001eb6a2f7a579e8a1278c01e74c" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.335545 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425189 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425537 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425695 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425713 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4gg\" (UniqueName: \"kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.425745 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.426251 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs" (OuterVolumeSpecName: "logs") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.426407 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key\") pod \"09cf3763-0b41-4452-a247-d9a56f58b05d\" (UID: \"09cf3763-0b41-4452-a247-d9a56f58b05d\") " Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.426834 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09cf3763-0b41-4452-a247-d9a56f58b05d-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.430208 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.434939 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg" (OuterVolumeSpecName: "kube-api-access-rg4gg") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "kube-api-access-rg4gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.456173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts" (OuterVolumeSpecName: "scripts") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.458027 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.459870 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data" (OuterVolumeSpecName: "config-data") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.485628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "09cf3763-0b41-4452-a247-d9a56f58b05d" (UID: "09cf3763-0b41-4452-a247-d9a56f58b05d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528887 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4gg\" (UniqueName: \"kubernetes.io/projected/09cf3763-0b41-4452-a247-d9a56f58b05d-kube-api-access-rg4gg\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528930 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528941 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528949 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528957 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09cf3763-0b41-4452-a247-d9a56f58b05d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.528965 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09cf3763-0b41-4452-a247-d9a56f58b05d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.672837 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" path="/var/lib/kubelet/pods/6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4/volumes" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.892063 4775 
generic.go:334] "Generic (PLEG): container finished" podID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerID="4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd" exitCode=137 Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.892141 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64fb567758-hd2ld" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.892971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerDied","Data":"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd"} Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.893062 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64fb567758-hd2ld" event={"ID":"09cf3763-0b41-4452-a247-d9a56f58b05d","Type":"ContainerDied","Data":"7045bd948f55810ab12433655a5f2808bafab5e3e89d9d68cb00186c737c1405"} Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.893095 4775 scope.go:117] "RemoveContainer" containerID="1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa" Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.894438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerStarted","Data":"6118bee8bc5688dea0b47263f9e7450128104ab9d8f36622df267e16e2ba897f"} Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.915937 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:08:31 crc kubenswrapper[4775]: I0321 05:08:31.931086 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64fb567758-hd2ld"] Mar 21 05:08:32 crc kubenswrapper[4775]: I0321 05:08:32.059407 4775 scope.go:117] "RemoveContainer" containerID="4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd" Mar 21 05:08:32 crc 
kubenswrapper[4775]: I0321 05:08:32.084702 4775 scope.go:117] "RemoveContainer" containerID="1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa" Mar 21 05:08:32 crc kubenswrapper[4775]: E0321 05:08:32.085257 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa\": container with ID starting with 1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa not found: ID does not exist" containerID="1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa" Mar 21 05:08:32 crc kubenswrapper[4775]: I0321 05:08:32.085293 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa"} err="failed to get container status \"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa\": rpc error: code = NotFound desc = could not find container \"1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa\": container with ID starting with 1da6670625f81794dc89cffb06ac6d008a9c7c8ed1b16d49c09b2332ff0e94fa not found: ID does not exist" Mar 21 05:08:32 crc kubenswrapper[4775]: I0321 05:08:32.085319 4775 scope.go:117] "RemoveContainer" containerID="4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd" Mar 21 05:08:32 crc kubenswrapper[4775]: E0321 05:08:32.085549 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd\": container with ID starting with 4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd not found: ID does not exist" containerID="4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd" Mar 21 05:08:32 crc kubenswrapper[4775]: I0321 05:08:32.085644 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd"} err="failed to get container status \"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd\": rpc error: code = NotFound desc = could not find container \"4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd\": container with ID starting with 4df1223c20d1958e562bffafa09f6a9587b632ec38a4f1dbf6a26792e8ce21fd not found: ID does not exist" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.179310 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gk78h"] Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180453 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330e8d60-7af3-42ed-a0d7-234264037d09" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180466 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="330e8d60-7af3-42ed-a0d7-234264037d09" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180497 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a5e145-5b67-4d02-9104-ade3e48888db" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180504 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a5e145-5b67-4d02-9104-ade3e48888db" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180518 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-httpd" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180525 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-httpd" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180537 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180544 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180553 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028d46f7-6f14-40f6-a6cf-77aeec59a99e" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180562 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="028d46f7-6f14-40f6-a6cf-77aeec59a99e" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180578 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e82da2-fdd4-4ead-8a9b-022e55a03690" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180584 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e82da2-fdd4-4ead-8a9b-022e55a03690" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180606 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-api" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180613 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-api" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180635 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon-log" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180641 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon-log" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180655 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc238c51-34cc-4cd6-bdf2-00e3747a9d67" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180662 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc238c51-34cc-4cd6-bdf2-00e3747a9d67" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: E0321 05:08:33.180690 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65ed673-344a-4064-ae90-c9d20964c648" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180696 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65ed673-344a-4064-ae90-c9d20964c648" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180977 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65ed673-344a-4064-ae90-c9d20964c648" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.180995 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc238c51-34cc-4cd6-bdf2-00e3747a9d67" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181012 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181027 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" containerName="horizon-log" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181049 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e82da2-fdd4-4ead-8a9b-022e55a03690" containerName="mariadb-database-create" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181074 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="330e8d60-7af3-42ed-a0d7-234264037d09" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 
05:08:33.181093 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="028d46f7-6f14-40f6-a6cf-77aeec59a99e" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181136 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-httpd" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181159 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc5e1ad-635a-478c-8fac-ac1fcdb2bad4" containerName="neutron-api" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181179 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a5e145-5b67-4d02-9104-ade3e48888db" containerName="mariadb-account-create-update" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.181890 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.187784 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bhr6z" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.188015 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.190172 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.206791 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gk78h"] Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.265053 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: 
\"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.265204 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.265237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jrc\" (UniqueName: \"kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.265285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.366777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.366869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jrc\" (UniqueName: 
\"kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.366928 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.367024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.371887 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.374649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.375005 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.392241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jrc\" (UniqueName: \"kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc\") pod \"nova-cell0-conductor-db-sync-gk78h\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.506875 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.560240 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.687794 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cf3763-0b41-4452-a247-d9a56f58b05d" path="/var/lib/kubelet/pods/09cf3763-0b41-4452-a247-d9a56f58b05d/volumes" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.943157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gk78h"] Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.957344 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerStarted","Data":"07f243e539099e70b23a12537cd33bb12a02581f84122079a9f29395c247bf3f"} Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.958440 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:08:33 crc kubenswrapper[4775]: I0321 05:08:33.989667 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.087781939 podStartE2EDuration="7.989647859s" podCreationTimestamp="2026-03-21 05:08:26 +0000 UTC" firstStartedPulling="2026-03-21 05:08:27.011243565 +0000 UTC m=+1259.987707189" lastFinishedPulling="2026-03-21 05:08:32.913109495 +0000 UTC m=+1265.889573109" observedRunningTime="2026-03-21 05:08:33.98226302 +0000 UTC m=+1266.958726644" watchObservedRunningTime="2026-03-21 05:08:33.989647859 +0000 UTC m=+1266.966111483" Mar 21 05:08:34 crc kubenswrapper[4775]: I0321 05:08:34.969356 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gk78h" event={"ID":"824625b1-30cc-42c5-ad83-5854770c2f46","Type":"ContainerStarted","Data":"2d77ca12f9d9586582c849b2cbdaffac8b8dcfcc0b7a10cdf251cde2b34e8f90"} Mar 21 05:08:34 crc kubenswrapper[4775]: I0321 05:08:34.970063 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-central-agent" containerID="cri-o://15da433bbe56b39c4ebf0eff9011ebe47b1beb92e06cabf9cd3d182dd0437ac0" gracePeriod=30 Mar 21 05:08:34 crc kubenswrapper[4775]: I0321 05:08:34.970895 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="proxy-httpd" containerID="cri-o://07f243e539099e70b23a12537cd33bb12a02581f84122079a9f29395c247bf3f" gracePeriod=30 Mar 21 05:08:34 crc kubenswrapper[4775]: I0321 05:08:34.970889 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="sg-core" containerID="cri-o://6118bee8bc5688dea0b47263f9e7450128104ab9d8f36622df267e16e2ba897f" gracePeriod=30 Mar 21 05:08:34 crc kubenswrapper[4775]: I0321 05:08:34.970924 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-notification-agent" containerID="cri-o://59ff223fce72d0f0a5dcf748449e4af7793a03edd914b1484e21b7aca4e40d65" gracePeriod=30 Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.982750 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerID="07f243e539099e70b23a12537cd33bb12a02581f84122079a9f29395c247bf3f" exitCode=0 Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.983100 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerID="6118bee8bc5688dea0b47263f9e7450128104ab9d8f36622df267e16e2ba897f" exitCode=2 Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.983148 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerID="59ff223fce72d0f0a5dcf748449e4af7793a03edd914b1484e21b7aca4e40d65" exitCode=0 Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.982832 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerDied","Data":"07f243e539099e70b23a12537cd33bb12a02581f84122079a9f29395c247bf3f"} Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.983370 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerDied","Data":"6118bee8bc5688dea0b47263f9e7450128104ab9d8f36622df267e16e2ba897f"} Mar 21 05:08:35 crc kubenswrapper[4775]: I0321 05:08:35.983388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerDied","Data":"59ff223fce72d0f0a5dcf748449e4af7793a03edd914b1484e21b7aca4e40d65"} Mar 21 05:08:37 crc kubenswrapper[4775]: I0321 05:08:37.949363 4775 scope.go:117] "RemoveContainer" 
containerID="a9b0f66cd072ab055301f7bd9fc0725dbbd203bf7fa9a48d07d610fb2904d30b" Mar 21 05:08:38 crc kubenswrapper[4775]: I0321 05:08:38.004521 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerID="15da433bbe56b39c4ebf0eff9011ebe47b1beb92e06cabf9cd3d182dd0437ac0" exitCode=0 Mar 21 05:08:38 crc kubenswrapper[4775]: I0321 05:08:38.004562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerDied","Data":"15da433bbe56b39c4ebf0eff9011ebe47b1beb92e06cabf9cd3d182dd0437ac0"} Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.174848 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.253320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.253377 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.253780 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.282072 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.354885 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cmhz\" (UniqueName: \"kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.354928 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.354995 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.355038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.355075 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd\") pod \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\" (UID: \"d3fcc332-1498-4e1d-bc39-58c203a4b6ad\") " Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.355463 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.355488 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.355954 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.367399 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz" (OuterVolumeSpecName: "kube-api-access-8cmhz") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "kube-api-access-8cmhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.370751 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts" (OuterVolumeSpecName: "scripts") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.457612 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cmhz\" (UniqueName: \"kubernetes.io/projected/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-kube-api-access-8cmhz\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.457959 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.457977 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.468189 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.480678 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data" (OuterVolumeSpecName: "config-data") pod "d3fcc332-1498-4e1d-bc39-58c203a4b6ad" (UID: "d3fcc332-1498-4e1d-bc39-58c203a4b6ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.558901 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:42 crc kubenswrapper[4775]: I0321 05:08:42.558941 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fcc332-1498-4e1d-bc39-58c203a4b6ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.081865 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gk78h" event={"ID":"824625b1-30cc-42c5-ad83-5854770c2f46","Type":"ContainerStarted","Data":"20d707fa7f70310d9cb80b150c04d331c2047a421ec95e9ba298af9cad00d19d"} Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.085560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fcc332-1498-4e1d-bc39-58c203a4b6ad","Type":"ContainerDied","Data":"03c1e47b1d8dd7264986151b452287f41c3c0c4a42e00d07ccce4db26c6d480e"} Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.085674 4775 scope.go:117] "RemoveContainer" containerID="07f243e539099e70b23a12537cd33bb12a02581f84122079a9f29395c247bf3f" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.085843 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.114605 4775 scope.go:117] "RemoveContainer" containerID="6118bee8bc5688dea0b47263f9e7450128104ab9d8f36622df267e16e2ba897f" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.132917 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gk78h" podStartSLOduration=1.900723777 podStartE2EDuration="10.132897913s" podCreationTimestamp="2026-03-21 05:08:33 +0000 UTC" firstStartedPulling="2026-03-21 05:08:33.946712896 +0000 UTC m=+1266.923176520" lastFinishedPulling="2026-03-21 05:08:42.178887032 +0000 UTC m=+1275.155350656" observedRunningTime="2026-03-21 05:08:43.099986923 +0000 UTC m=+1276.076450587" watchObservedRunningTime="2026-03-21 05:08:43.132897913 +0000 UTC m=+1276.109361537" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.135618 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.142318 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.154614 4775 scope.go:117] "RemoveContainer" containerID="59ff223fce72d0f0a5dcf748449e4af7793a03edd914b1484e21b7aca4e40d65" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.156233 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:43 crc kubenswrapper[4775]: E0321 05:08:43.156753 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="proxy-httpd" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.156774 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="proxy-httpd" Mar 21 05:08:43 crc kubenswrapper[4775]: E0321 05:08:43.156788 4775 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-central-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.156794 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-central-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: E0321 05:08:43.156841 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="sg-core" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.156847 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="sg-core" Mar 21 05:08:43 crc kubenswrapper[4775]: E0321 05:08:43.156859 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-notification-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.156864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-notification-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.157059 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-notification-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.157080 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="ceilometer-central-agent" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.157090 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="proxy-httpd" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.157100 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" containerName="sg-core" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.158797 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.164466 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.164771 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172204 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172307 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4msqv\" (UniqueName: \"kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv\") pod \"ceilometer-0\" 
(UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172430 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.172498 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.209380 4775 scope.go:117] "RemoveContainer" containerID="15da433bbe56b39c4ebf0eff9011ebe47b1beb92e06cabf9cd3d182dd0437ac0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.273974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.274020 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4msqv\" (UniqueName: \"kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv\") pod 
\"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.274417 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.274515 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.274633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.274801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.275039 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.275486 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.275434 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.278518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.279735 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.279952 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.280527 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.298058 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4msqv\" (UniqueName: \"kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv\") pod \"ceilometer-0\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.478187 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.681742 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fcc332-1498-4e1d-bc39-58c203a4b6ad" path="/var/lib/kubelet/pods/d3fcc332-1498-4e1d-bc39-58c203a4b6ad/volumes" Mar 21 05:08:43 crc kubenswrapper[4775]: I0321 05:08:43.950957 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:44 crc kubenswrapper[4775]: I0321 05:08:44.095541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerStarted","Data":"086d1b17c5dfcb625f7ec04d8ff48e78b1e6cd139ff3016cea3aa265e3ae9e38"} Mar 21 05:08:44 crc kubenswrapper[4775]: I0321 05:08:44.745278 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 05:08:44 crc kubenswrapper[4775]: I0321 05:08:44.745744 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-log" containerID="cri-o://29a18bd4308c28984ec8d5e5e490324db58708c1d07372b24ee7783e1995157d" gracePeriod=30 Mar 21 05:08:44 crc kubenswrapper[4775]: I0321 05:08:44.745818 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-httpd" 
containerID="cri-o://0173c3dcb2f1ee362b645abaeead4a322f137b6409341a3b746ad5f19cffafb5" gracePeriod=30 Mar 21 05:08:45 crc kubenswrapper[4775]: I0321 05:08:45.108995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerStarted","Data":"f3b6a0d856af07fa3386647a1df14c43107fa18d16407653836fe8f6805030db"} Mar 21 05:08:45 crc kubenswrapper[4775]: I0321 05:08:45.111162 4775 generic.go:334] "Generic (PLEG): container finished" podID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerID="29a18bd4308c28984ec8d5e5e490324db58708c1d07372b24ee7783e1995157d" exitCode=143 Mar 21 05:08:45 crc kubenswrapper[4775]: I0321 05:08:45.111206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerDied","Data":"29a18bd4308c28984ec8d5e5e490324db58708c1d07372b24ee7783e1995157d"} Mar 21 05:08:46 crc kubenswrapper[4775]: I0321 05:08:46.121495 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerStarted","Data":"8cc8d28938a70addf2d6412cf322bdf56956d1adec39274fbeadbaacd685266d"} Mar 21 05:08:46 crc kubenswrapper[4775]: I0321 05:08:46.319343 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:08:46 crc kubenswrapper[4775]: I0321 05:08:46.319854 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-log" containerID="cri-o://b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268" gracePeriod=30 Mar 21 05:08:46 crc kubenswrapper[4775]: I0321 05:08:46.320619 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-httpd" containerID="cri-o://9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405" gracePeriod=30 Mar 21 05:08:46 crc kubenswrapper[4775]: I0321 05:08:46.391961 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:08:47 crc kubenswrapper[4775]: I0321 05:08:47.156038 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerStarted","Data":"756b7ac25394ff7e03c954f4dded0675f6295888345b0b7a098c3ad32426a8ef"} Mar 21 05:08:47 crc kubenswrapper[4775]: I0321 05:08:47.162021 4775 generic.go:334] "Generic (PLEG): container finished" podID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerID="b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268" exitCode=143 Mar 21 05:08:47 crc kubenswrapper[4775]: I0321 05:08:47.162061 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerDied","Data":"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"} Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.184864 4775 generic.go:334] "Generic (PLEG): container finished" podID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerID="0173c3dcb2f1ee362b645abaeead4a322f137b6409341a3b746ad5f19cffafb5" exitCode=0 Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.185150 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerDied","Data":"0173c3dcb2f1ee362b645abaeead4a322f137b6409341a3b746ad5f19cffafb5"} Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.422977 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582065 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582430 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4spw\" (UniqueName: \"kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582496 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582574 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582635 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.582670 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") " Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.583303 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs" (OuterVolumeSpecName: "logs") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.583645 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.589307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts" (OuterVolumeSpecName: "scripts") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.589329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw" (OuterVolumeSpecName: "kube-api-access-d4spw") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "kube-api-access-d4spw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.590354 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.625543 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:48 crc kubenswrapper[4775]: E0321 05:08:48.647713 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs podName:47e577d7-e389-4135-b4fb-979bd627eaa9 nodeName:}" failed. No retries permitted until 2026-03-21 05:08:49.147680543 +0000 UTC m=+1282.124144167 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9") : error deleting /var/lib/kubelet/pods/47e577d7-e389-4135-b4fb-979bd627eaa9/volume-subpaths: remove /var/lib/kubelet/pods/47e577d7-e389-4135-b4fb-979bd627eaa9/volume-subpaths: no such file or directory
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.650780 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data" (OuterVolumeSpecName: "config-data") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.684812 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685145 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685255 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685346 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685432 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4spw\" (UniqueName: \"kubernetes.io/projected/47e577d7-e389-4135-b4fb-979bd627eaa9-kube-api-access-d4spw\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685520 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.685590 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/47e577d7-e389-4135-b4fb-979bd627eaa9-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.720308 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 21 05:08:48 crc kubenswrapper[4775]: I0321 05:08:48.787714 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.193265 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") pod \"47e577d7-e389-4135-b4fb-979bd627eaa9\" (UID: \"47e577d7-e389-4135-b4fb-979bd627eaa9\") "
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.196878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"47e577d7-e389-4135-b4fb-979bd627eaa9","Type":"ContainerDied","Data":"ea4d39e040f38b33347813ecac429d1646c7747cc95596bc69a68249c6432f4a"}
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.197033 4775 scope.go:117] "RemoveContainer" containerID="0173c3dcb2f1ee362b645abaeead4a322f137b6409341a3b746ad5f19cffafb5"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.197081 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.200914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerStarted","Data":"30455e621655fa15c45c8fad2a223bc59355675a9f5b3da03f27d5155ab96f2d"}
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.201188 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.201103 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="proxy-httpd" containerID="cri-o://30455e621655fa15c45c8fad2a223bc59355675a9f5b3da03f27d5155ab96f2d" gracePeriod=30
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.201134 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="sg-core" containerID="cri-o://756b7ac25394ff7e03c954f4dded0675f6295888345b0b7a098c3ad32426a8ef" gracePeriod=30
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.201160 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-notification-agent" containerID="cri-o://8cc8d28938a70addf2d6412cf322bdf56956d1adec39274fbeadbaacd685266d" gracePeriod=30
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.200942 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-central-agent" containerID="cri-o://f3b6a0d856af07fa3386647a1df14c43107fa18d16407653836fe8f6805030db" gracePeriod=30
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.203733 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "47e577d7-e389-4135-b4fb-979bd627eaa9" (UID: "47e577d7-e389-4135-b4fb-979bd627eaa9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.271607 4775 scope.go:117] "RemoveContainer" containerID="29a18bd4308c28984ec8d5e5e490324db58708c1d07372b24ee7783e1995157d"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.297000 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47e577d7-e389-4135-b4fb-979bd627eaa9-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.632021 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.690917812 podStartE2EDuration="6.63197881s" podCreationTimestamp="2026-03-21 05:08:43 +0000 UTC" firstStartedPulling="2026-03-21 05:08:43.967429776 +0000 UTC m=+1276.943893410" lastFinishedPulling="2026-03-21 05:08:48.908490794 +0000 UTC m=+1281.884954408" observedRunningTime="2026-03-21 05:08:49.225878173 +0000 UTC m=+1282.202341797" watchObservedRunningTime="2026-03-21 05:08:49.63197881 +0000 UTC m=+1282.608442434"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.640615 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.655861 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.673812 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" path="/var/lib/kubelet/pods/47e577d7-e389-4135-b4fb-979bd627eaa9/volumes"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.691237 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:08:49 crc kubenswrapper[4775]: E0321 05:08:49.691797 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-httpd"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.691813 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-httpd"
Mar 21 05:08:49 crc kubenswrapper[4775]: E0321 05:08:49.691828 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-log"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.691835 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-log"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.691996 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-httpd"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.692026 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e577d7-e389-4135-b4fb-979bd627eaa9" containerName="glance-log"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.692943 4775 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.700756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.701763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.702449 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856027 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-config-data\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856151 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856188 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856292 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-logs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856326 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-scripts\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghqz9\" (UniqueName: \"kubernetes.io/projected/01638b90-5e17-43b3-a3b5-90726b26e243-kube-api-access-ghqz9\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.856404 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-logs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-scripts\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghqz9\" (UniqueName: \"kubernetes.io/projected/01638b90-5e17-43b3-a3b5-90726b26e243-kube-api-access-ghqz9\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-config-data\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.958988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.959887 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.959946 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.959887 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01638b90-5e17-43b3-a3b5-90726b26e243-logs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.975612 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-scripts\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.976613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.977225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:49 crc kubenswrapper[4775]: I0321 05:08:49.988174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01638b90-5e17-43b3-a3b5-90726b26e243-config-data\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.008229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghqz9\" (UniqueName: \"kubernetes.io/projected/01638b90-5e17-43b3-a3b5-90726b26e243-kube-api-access-ghqz9\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.013584 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"01638b90-5e17-43b3-a3b5-90726b26e243\") " pod="openstack/glance-default-external-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.022846 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.156796 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.257827 4775 generic.go:334] "Generic (PLEG): container finished" podID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerID="756b7ac25394ff7e03c954f4dded0675f6295888345b0b7a098c3ad32426a8ef" exitCode=2
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.257855 4775 generic.go:334] "Generic (PLEG): container finished" podID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerID="8cc8d28938a70addf2d6412cf322bdf56956d1adec39274fbeadbaacd685266d" exitCode=0
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.257906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerDied","Data":"756b7ac25394ff7e03c954f4dded0675f6295888345b0b7a098c3ad32426a8ef"}
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.257934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerDied","Data":"8cc8d28938a70addf2d6412cf322bdf56956d1adec39274fbeadbaacd685266d"}
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.260034 4775 generic.go:334] "Generic (PLEG): container finished" podID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerID="9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405" exitCode=0
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.260061 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerDied","Data":"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"}
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.260082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71255c4f-1e47-4e35-845f-876fff5fd6d4","Type":"ContainerDied","Data":"5d037f60a7382103283fb7dc977007a3668afac9d065f14a09fa2c15aa21b79e"}
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.260098 4775 scope.go:117] "RemoveContainer" containerID="9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.260266 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266061 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266079 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266169 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266203 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266238 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266291 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.266365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2pk\" (UniqueName: \"kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk\") pod \"71255c4f-1e47-4e35-845f-876fff5fd6d4\" (UID: \"71255c4f-1e47-4e35-845f-876fff5fd6d4\") "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.270881 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs" (OuterVolumeSpecName: "logs") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.270901 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.286960 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.287028 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts" (OuterVolumeSpecName: "scripts") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.288100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk" (OuterVolumeSpecName: "kube-api-access-pz2pk") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "kube-api-access-pz2pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.321396 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.332250 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data" (OuterVolumeSpecName: "config-data") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.364473 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "71255c4f-1e47-4e35-845f-876fff5fd6d4" (UID: "71255c4f-1e47-4e35-845f-876fff5fd6d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368402 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368439 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2pk\" (UniqueName: \"kubernetes.io/projected/71255c4f-1e47-4e35-845f-876fff5fd6d4-kube-api-access-pz2pk\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368454 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368466 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-internal-tls-certs\") on node \"crc\"
DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368504 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368517 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71255c4f-1e47-4e35-845f-876fff5fd6d4-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368528 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.368539 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71255c4f-1e47-4e35-845f-876fff5fd6d4-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.397991 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.398418 4775 scope.go:117] "RemoveContainer" containerID="b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.438452 4775 scope.go:117] "RemoveContainer" containerID="9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"
Mar 21 05:08:50 crc kubenswrapper[4775]: E0321 05:08:50.440943 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405\": container with ID starting with 9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405 not found: ID does not exist" containerID="9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.441001 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405"} err="failed to get container status \"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405\": rpc error: code = NotFound desc = could not find container \"9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405\": container with ID starting with 9f447f1b0e6151bf34a0eec832692e899e0a001912e2cf06411e6494eecfb405 not found: ID does not exist"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.441035 4775 scope.go:117] "RemoveContainer" containerID="b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"
Mar 21 05:08:50 crc kubenswrapper[4775]: E0321 05:08:50.445572 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268\": container with ID starting with b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268 not found: ID does not exist" containerID="b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.445655 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268"} err="failed to get container status \"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268\": rpc error: code = NotFound desc = could not find container \"b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268\": container with ID starting with b0bf1b828206099564568160f6163284ee71379761dd593fcfafe3139a339268 not found: ID does not exist"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.469967 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.607413 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.618390 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.644274 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:08:50 crc kubenswrapper[4775]: E0321 05:08:50.644697 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-httpd"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.644710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-httpd"
Mar 21 05:08:50 crc kubenswrapper[4775]: E0321 05:08:50.644725 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-log"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.644732 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-log"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.644934 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-log"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.644958 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" containerName="glance-httpd"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.646068 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.649252 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.649539 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.667573 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.721951 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.776733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.776880 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.776957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.777109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.777186 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.777220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.777377 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftppj\" (UniqueName: \"kubernetes.io/projected/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-kube-api-access-ftppj\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.777670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0"
Mar
21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879407 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879460 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879493 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879591 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftppj\" (UniqueName: \"kubernetes.io/projected/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-kube-api-access-ftppj\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.879789 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.880216 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.880265 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.884671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.884702 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.885783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.894543 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.902474 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftppj\" (UniqueName: 
\"kubernetes.io/projected/e9e72c4b-a3fb-41eb-974a-74d24d6cdac9-kube-api-access-ftppj\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.911051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9\") " pod="openstack/glance-default-internal-api-0" Mar 21 05:08:50 crc kubenswrapper[4775]: I0321 05:08:50.964839 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 05:08:51 crc kubenswrapper[4775]: I0321 05:08:51.271254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01638b90-5e17-43b3-a3b5-90726b26e243","Type":"ContainerStarted","Data":"306032f21837c9eaad68cb251fcdd340b028c66fc621cccbb829910f80e9b9eb"} Mar 21 05:08:51 crc kubenswrapper[4775]: I0321 05:08:51.561170 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 05:08:51 crc kubenswrapper[4775]: W0321 05:08:51.573226 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9e72c4b_a3fb_41eb_974a_74d24d6cdac9.slice/crio-43fe5d0cb79c2f9695f700da1f42f8761f2efb036bd157b8a381ccb30c2c5b5e WatchSource:0}: Error finding container 43fe5d0cb79c2f9695f700da1f42f8761f2efb036bd157b8a381ccb30c2c5b5e: Status 404 returned error can't find the container with id 43fe5d0cb79c2f9695f700da1f42f8761f2efb036bd157b8a381ccb30c2c5b5e Mar 21 05:08:51 crc kubenswrapper[4775]: I0321 05:08:51.678148 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71255c4f-1e47-4e35-845f-876fff5fd6d4" 
path="/var/lib/kubelet/pods/71255c4f-1e47-4e35-845f-876fff5fd6d4/volumes" Mar 21 05:08:52 crc kubenswrapper[4775]: I0321 05:08:52.293759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9","Type":"ContainerStarted","Data":"8c18a8a69790fe1df7223ccfc07aa1d02e91b4c205aeb964f24e898e96a99bd7"} Mar 21 05:08:52 crc kubenswrapper[4775]: I0321 05:08:52.294043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9","Type":"ContainerStarted","Data":"43fe5d0cb79c2f9695f700da1f42f8761f2efb036bd157b8a381ccb30c2c5b5e"} Mar 21 05:08:52 crc kubenswrapper[4775]: I0321 05:08:52.297391 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01638b90-5e17-43b3-a3b5-90726b26e243","Type":"ContainerStarted","Data":"bb11d5c8afbf3756020e0b603783b8b46872e5fea4262f38b6c80e6d01e2ae3c"} Mar 21 05:08:52 crc kubenswrapper[4775]: I0321 05:08:52.297437 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01638b90-5e17-43b3-a3b5-90726b26e243","Type":"ContainerStarted","Data":"e388a0ecedcb04f4424057bc2c07d6e3f4d796b885be7531b5e2d7a5759e1c9b"} Mar 21 05:08:52 crc kubenswrapper[4775]: I0321 05:08:52.326991 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.326967842 podStartE2EDuration="3.326967842s" podCreationTimestamp="2026-03-21 05:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:52.315447776 +0000 UTC m=+1285.291911400" watchObservedRunningTime="2026-03-21 05:08:52.326967842 +0000 UTC m=+1285.303431476" Mar 21 05:08:53 crc kubenswrapper[4775]: I0321 05:08:53.311228 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9e72c4b-a3fb-41eb-974a-74d24d6cdac9","Type":"ContainerStarted","Data":"902eb75dc6f362171ab35925bd1c393028b1b3fbaa2d60e5676d0154648ce94f"} Mar 21 05:08:53 crc kubenswrapper[4775]: I0321 05:08:53.341160 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.341131693 podStartE2EDuration="3.341131693s" podCreationTimestamp="2026-03-21 05:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:53.33183995 +0000 UTC m=+1286.308303574" watchObservedRunningTime="2026-03-21 05:08:53.341131693 +0000 UTC m=+1286.317595317" Mar 21 05:08:54 crc kubenswrapper[4775]: I0321 05:08:54.322057 4775 generic.go:334] "Generic (PLEG): container finished" podID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerID="f3b6a0d856af07fa3386647a1df14c43107fa18d16407653836fe8f6805030db" exitCode=0 Mar 21 05:08:54 crc kubenswrapper[4775]: I0321 05:08:54.322111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerDied","Data":"f3b6a0d856af07fa3386647a1df14c43107fa18d16407653836fe8f6805030db"} Mar 21 05:08:54 crc kubenswrapper[4775]: I0321 05:08:54.323887 4775 generic.go:334] "Generic (PLEG): container finished" podID="824625b1-30cc-42c5-ad83-5854770c2f46" containerID="20d707fa7f70310d9cb80b150c04d331c2047a421ec95e9ba298af9cad00d19d" exitCode=0 Mar 21 05:08:54 crc kubenswrapper[4775]: I0321 05:08:54.323919 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gk78h" event={"ID":"824625b1-30cc-42c5-ad83-5854770c2f46","Type":"ContainerDied","Data":"20d707fa7f70310d9cb80b150c04d331c2047a421ec95e9ba298af9cad00d19d"} Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.702907 4775 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.767994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data\") pod \"824625b1-30cc-42c5-ad83-5854770c2f46\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.768192 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle\") pod \"824625b1-30cc-42c5-ad83-5854770c2f46\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.768304 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts\") pod \"824625b1-30cc-42c5-ad83-5854770c2f46\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.768338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jrc\" (UniqueName: \"kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc\") pod \"824625b1-30cc-42c5-ad83-5854770c2f46\" (UID: \"824625b1-30cc-42c5-ad83-5854770c2f46\") " Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.774268 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts" (OuterVolumeSpecName: "scripts") pod "824625b1-30cc-42c5-ad83-5854770c2f46" (UID: "824625b1-30cc-42c5-ad83-5854770c2f46"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.774296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc" (OuterVolumeSpecName: "kube-api-access-59jrc") pod "824625b1-30cc-42c5-ad83-5854770c2f46" (UID: "824625b1-30cc-42c5-ad83-5854770c2f46"). InnerVolumeSpecName "kube-api-access-59jrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.809101 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "824625b1-30cc-42c5-ad83-5854770c2f46" (UID: "824625b1-30cc-42c5-ad83-5854770c2f46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.820992 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data" (OuterVolumeSpecName: "config-data") pod "824625b1-30cc-42c5-ad83-5854770c2f46" (UID: "824625b1-30cc-42c5-ad83-5854770c2f46"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.870894 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.870945 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.870961 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jrc\" (UniqueName: \"kubernetes.io/projected/824625b1-30cc-42c5-ad83-5854770c2f46-kube-api-access-59jrc\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:55 crc kubenswrapper[4775]: I0321 05:08:55.870978 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824625b1-30cc-42c5-ad83-5854770c2f46-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.347622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gk78h" event={"ID":"824625b1-30cc-42c5-ad83-5854770c2f46","Type":"ContainerDied","Data":"2d77ca12f9d9586582c849b2cbdaffac8b8dcfcc0b7a10cdf251cde2b34e8f90"} Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.347661 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d77ca12f9d9586582c849b2cbdaffac8b8dcfcc0b7a10cdf251cde2b34e8f90" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.347669 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gk78h" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.448967 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 05:08:56 crc kubenswrapper[4775]: E0321 05:08:56.449513 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824625b1-30cc-42c5-ad83-5854770c2f46" containerName="nova-cell0-conductor-db-sync" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.449537 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="824625b1-30cc-42c5-ad83-5854770c2f46" containerName="nova-cell0-conductor-db-sync" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.449764 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="824625b1-30cc-42c5-ad83-5854770c2f46" containerName="nova-cell0-conductor-db-sync" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.450504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.455853 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.456111 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bhr6z" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.467278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.483933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nl54\" (UniqueName: \"kubernetes.io/projected/789a0bb8-b131-4144-9400-7c32a604d6d5-kube-api-access-5nl54\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc 
kubenswrapper[4775]: I0321 05:08:56.484021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.484164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.585586 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nl54\" (UniqueName: \"kubernetes.io/projected/789a0bb8-b131-4144-9400-7c32a604d6d5-kube-api-access-5nl54\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.585680 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.585774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.590576 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.591029 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/789a0bb8-b131-4144-9400-7c32a604d6d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.604393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nl54\" (UniqueName: \"kubernetes.io/projected/789a0bb8-b131-4144-9400-7c32a604d6d5-kube-api-access-5nl54\") pod \"nova-cell0-conductor-0\" (UID: \"789a0bb8-b131-4144-9400-7c32a604d6d5\") " pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:56 crc kubenswrapper[4775]: I0321 05:08:56.775339 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:57 crc kubenswrapper[4775]: I0321 05:08:57.285135 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 05:08:57 crc kubenswrapper[4775]: I0321 05:08:57.356343 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"789a0bb8-b131-4144-9400-7c32a604d6d5","Type":"ContainerStarted","Data":"55e62127f17f2e0706ae755a8c32b956ddd17f11e1e5e7eab7ea44e3cbd3b157"} Mar 21 05:08:58 crc kubenswrapper[4775]: I0321 05:08:58.365838 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"789a0bb8-b131-4144-9400-7c32a604d6d5","Type":"ContainerStarted","Data":"b784379bb60e5f18b33c1593dc0cd9eefd62b1907352006d148b3c310fb65252"} Mar 21 05:08:58 crc kubenswrapper[4775]: I0321 05:08:58.367361 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 21 05:08:58 crc kubenswrapper[4775]: I0321 05:08:58.396370 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.396351336 podStartE2EDuration="2.396351336s" podCreationTimestamp="2026-03-21 05:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:08:58.39155521 +0000 UTC m=+1291.368018844" watchObservedRunningTime="2026-03-21 05:08:58.396351336 +0000 UTC m=+1291.372814970" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.024464 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.024765 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 
05:09:00.052256 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.067457 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.386741 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.386804 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.965880 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.965942 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:00 crc kubenswrapper[4775]: I0321 05:09:00.995632 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:01 crc kubenswrapper[4775]: I0321 05:09:01.018208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:01 crc kubenswrapper[4775]: I0321 05:09:01.395070 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:01 crc kubenswrapper[4775]: I0321 05:09:01.395104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:02 crc kubenswrapper[4775]: I0321 05:09:02.312364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:09:02 crc kubenswrapper[4775]: I0321 
05:09:02.321032 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 05:09:03 crc kubenswrapper[4775]: I0321 05:09:03.345712 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:03 crc kubenswrapper[4775]: I0321 05:09:03.382695 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 05:09:06 crc kubenswrapper[4775]: I0321 05:09:06.803770 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.281219 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pckp8"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.282601 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.284601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.284862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.302286 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pckp8"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.402545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdhj\" (UniqueName: \"kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: 
I0321 05:09:07.402664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.402818 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.402911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.504688 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.505058 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 
05:09:07.505217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.505345 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdhj\" (UniqueName: \"kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.515817 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.527190 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.528753 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.533455 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.534230 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.536107 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.555784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdhj\" (UniqueName: \"kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj\") pod \"nova-cell0-cell-mapping-pckp8\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.570774 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.595314 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.599193 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.605762 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.606765 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.618980 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.698445 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.699867 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.705627 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715542 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715684 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9m8c\" (UniqueName: \"kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " 
pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715768 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715841 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715884 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.715923 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txpj5\" (UniqueName: \"kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.716032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpl5g\" (UniqueName: \"kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.716076 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.730070 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.773870 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.775672 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820405 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvg9\" (UniqueName: \"kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txpj5\" (UniqueName: \"kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820548 
4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpl5g\" (UniqueName: \"kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820606 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820689 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9m8c\" (UniqueName: \"kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820788 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data\") pod \"nova-api-0\" (UID: 
\"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.820873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.823523 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.824406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.841845 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.841956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.842832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.843937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.849818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpl5g\" (UniqueName: \"kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.851162 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.859378 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " pod="openstack/nova-scheduler-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.859376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9m8c\" (UniqueName: \"kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c\") pod \"nova-metadata-0\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") " pod="openstack/nova-metadata-0" 
Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.859947 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txpj5\" (UniqueName: \"kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5\") pod \"nova-api-0\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.878198 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.889217 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.890458 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.902328 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923606 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsh9\" (UniqueName: 
\"kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923852 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923883 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.923931 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.924002 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvg9\" (UniqueName: \"kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.925519 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.927482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.928361 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.931661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config\") pod 
\"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.936497 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.937604 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.950421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvg9\" (UniqueName: \"kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9\") pod \"dnsmasq-dns-bccf8f775-qsbbk\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:07 crc kubenswrapper[4775]: I0321 05:09:07.956060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.026300 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.029889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.030205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swsh9\" (UniqueName: \"kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.030275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.039104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.045645 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 
05:09:08.051916 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsh9\" (UniqueName: \"kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.054406 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.102396 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.244813 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.314130 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pckp8"] Mar 21 05:09:08 crc kubenswrapper[4775]: W0321 05:09:08.318076 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a407dd6_593f_4806_884b_8e031639d25d.slice/crio-040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90 WatchSource:0}: Error finding container 040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90: Status 404 returned error can't find the container with id 040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90 Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.494183 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pckp8" event={"ID":"5a407dd6-593f-4806-884b-8e031639d25d","Type":"ContainerStarted","Data":"040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90"} Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.561424 4775 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-conductor-db-sync-vd764"]
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.562475 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.571499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.571693 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.662816 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.696321 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fmmn\" (UniqueName: \"kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.696607 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.696733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.743715 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vd764"]
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.810307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.810582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.810703 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fmmn\" (UniqueName: \"kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.810918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.832742 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.833855 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.833924 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.862232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fmmn\" (UniqueName: \"kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.862352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.864325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts\") pod \"nova-cell1-conductor-db-sync-vd764\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.899327 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.964919 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vd764"
Mar 21 05:09:08 crc kubenswrapper[4775]: I0321 05:09:08.967414 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"]
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.109429 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.491152 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vd764"]
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.524359 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pckp8" event={"ID":"5a407dd6-593f-4806-884b-8e031639d25d","Type":"ContainerStarted","Data":"650b9597595a69528b159834c598c6d1c2eb6858daf6ed558c1a6a03d2d450dd"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.537242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerStarted","Data":"3c21cc3a28ac294006c3d112d32cbbbabe9d8f787077b90e376062aeec6dab3e"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.547978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2","Type":"ContainerStarted","Data":"e7ffc178120764777a6e0c0217aa92bdaf948da3b3e7aaf4502f70ada80abfb2"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.555099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e114fa87-9e65-4a0b-bad1-81f521aeef85","Type":"ContainerStarted","Data":"8a454c45ca71508ee3e69e89e31c30561bb964893c09c7b3c20372f820d152ee"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.557772 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerStarted","Data":"c67a6d4fa6caed2b2522033053bcb1027513121c3bd46425c786741079d6ba00"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.566760 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pckp8" podStartSLOduration=2.566735796 podStartE2EDuration="2.566735796s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:09.539718342 +0000 UTC m=+1302.516181966" watchObservedRunningTime="2026-03-21 05:09:09.566735796 +0000 UTC m=+1302.543199420"
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.568184 4775 generic.go:334] "Generic (PLEG): container finished" podID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerID="992670c46bdc3ecfcf02d509f68efcc94de86d4d365af73eee5b378735e764a6" exitCode=0
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.568206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" event={"ID":"20a97ec2-24be-494f-a3e8-dc7d202021c3","Type":"ContainerDied","Data":"992670c46bdc3ecfcf02d509f68efcc94de86d4d365af73eee5b378735e764a6"}
Mar 21 05:09:09 crc kubenswrapper[4775]: I0321 05:09:09.569683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" event={"ID":"20a97ec2-24be-494f-a3e8-dc7d202021c3","Type":"ContainerStarted","Data":"9a2e5054b120d9d9b114985bfe9a3a6397285644c20c24699a021ae58fcb7c61"}
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.581141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" event={"ID":"20a97ec2-24be-494f-a3e8-dc7d202021c3","Type":"ContainerStarted","Data":"210c52f72b30e6d06b1636b7f4ba7c385a206b219bca52c176d9da81b984a93a"}
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.581483 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk"
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.587261 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vd764" event={"ID":"fb1dd6e9-8801-464e-952c-d345498b132a","Type":"ContainerStarted","Data":"166965bd5533ce39880e58f576322a3fbe5ebbddcee7067e02f0a62bbadf6013"}
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.587308 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vd764" event={"ID":"fb1dd6e9-8801-464e-952c-d345498b132a","Type":"ContainerStarted","Data":"5e30087cecb9bbd6f3df2a364239f70f52ef2de20ec9c4dfd84fb90814b44306"}
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.614759 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" podStartSLOduration=3.614736293 podStartE2EDuration="3.614736293s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:10.60507384 +0000 UTC m=+1303.581537474" watchObservedRunningTime="2026-03-21 05:09:10.614736293 +0000 UTC m=+1303.591199917"
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.627065 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-vd764" podStartSLOduration=2.627041451 podStartE2EDuration="2.627041451s" podCreationTimestamp="2026-03-21 05:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:10.623430929 +0000 UTC m=+1303.599894563" watchObservedRunningTime="2026-03-21 05:09:10.627041451 +0000 UTC m=+1303.603505095"
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.938859 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:10 crc kubenswrapper[4775]: I0321 05:09:10.948005 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.483291 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.620228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2","Type":"ContainerStarted","Data":"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.620370 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa" gracePeriod=30
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.626737 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e114fa87-9e65-4a0b-bad1-81f521aeef85","Type":"ContainerStarted","Data":"4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.630045 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerStarted","Data":"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.630081 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerStarted","Data":"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.630213 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-log" containerID="cri-o://875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433" gracePeriod=30
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.630497 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-metadata" containerID="cri-o://2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb" gracePeriod=30
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.638720 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerStarted","Data":"3d41a9111ff02a22981deef2e158c7aaae04c761a3f477f9cde2ecff8054c0f1"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.638771 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerStarted","Data":"a3e8af6598015c259cd8e37d94c33e2e14ca3a07bdbbfc459894647fdb9e40ea"}
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.670734 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.061585605 podStartE2EDuration="6.670707805s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="2026-03-21 05:09:09.119527053 +0000 UTC m=+1302.095990677" lastFinishedPulling="2026-03-21 05:09:12.728649253 +0000 UTC m=+1305.705112877" observedRunningTime="2026-03-21 05:09:13.668329987 +0000 UTC m=+1306.644793611" watchObservedRunningTime="2026-03-21 05:09:13.670707805 +0000 UTC m=+1306.647171429"
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.692981 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8407378420000002 podStartE2EDuration="6.692964484s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="2026-03-21 05:09:08.882984236 +0000 UTC m=+1301.859447860" lastFinishedPulling="2026-03-21 05:09:12.735210878 +0000 UTC m=+1305.711674502" observedRunningTime="2026-03-21 05:09:13.690022971 +0000 UTC m=+1306.666486595" watchObservedRunningTime="2026-03-21 05:09:13.692964484 +0000 UTC m=+1306.669428108"
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.712175 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.713941068 podStartE2EDuration="6.712159547s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="2026-03-21 05:09:08.730365342 +0000 UTC m=+1301.706828966" lastFinishedPulling="2026-03-21 05:09:12.728583821 +0000 UTC m=+1305.705047445" observedRunningTime="2026-03-21 05:09:13.710271423 +0000 UTC m=+1306.686735047" watchObservedRunningTime="2026-03-21 05:09:13.712159547 +0000 UTC m=+1306.688623171"
Mar 21 05:09:13 crc kubenswrapper[4775]: I0321 05:09:13.734971 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8688467060000002 podStartE2EDuration="6.734952871s" podCreationTimestamp="2026-03-21 05:09:07 +0000 UTC" firstStartedPulling="2026-03-21 05:09:08.862478946 +0000 UTC m=+1301.838942570" lastFinishedPulling="2026-03-21 05:09:12.728585111 +0000 UTC m=+1305.705048735" observedRunningTime="2026-03-21 05:09:13.734633502 +0000 UTC m=+1306.711097126" watchObservedRunningTime="2026-03-21 05:09:13.734952871 +0000 UTC m=+1306.711416495"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.609754 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658476 4775 generic.go:334] "Generic (PLEG): container finished" podID="d7219652-7280-4a81-b501-b6e41673162b" containerID="2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb" exitCode=0
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658814 4775 generic.go:334] "Generic (PLEG): container finished" podID="d7219652-7280-4a81-b501-b6e41673162b" containerID="875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433" exitCode=143
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658565 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerDied","Data":"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"}
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658921 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerDied","Data":"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"}
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d7219652-7280-4a81-b501-b6e41673162b","Type":"ContainerDied","Data":"c67a6d4fa6caed2b2522033053bcb1027513121c3bd46425c786741079d6ba00"}
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.658952 4775 scope.go:117] "RemoveContainer" containerID="2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.664789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs\") pod \"d7219652-7280-4a81-b501-b6e41673162b\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") "
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.664899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9m8c\" (UniqueName: \"kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c\") pod \"d7219652-7280-4a81-b501-b6e41673162b\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") "
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.665038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle\") pod \"d7219652-7280-4a81-b501-b6e41673162b\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") "
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.665064 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data\") pod \"d7219652-7280-4a81-b501-b6e41673162b\" (UID: \"d7219652-7280-4a81-b501-b6e41673162b\") "
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.665453 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs" (OuterVolumeSpecName: "logs") pod "d7219652-7280-4a81-b501-b6e41673162b" (UID: "d7219652-7280-4a81-b501-b6e41673162b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.666245 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7219652-7280-4a81-b501-b6e41673162b-logs\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.674598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c" (OuterVolumeSpecName: "kube-api-access-g9m8c") pod "d7219652-7280-4a81-b501-b6e41673162b" (UID: "d7219652-7280-4a81-b501-b6e41673162b"). InnerVolumeSpecName "kube-api-access-g9m8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.687862 4775 scope.go:117] "RemoveContainer" containerID="875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.698307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data" (OuterVolumeSpecName: "config-data") pod "d7219652-7280-4a81-b501-b6e41673162b" (UID: "d7219652-7280-4a81-b501-b6e41673162b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.706291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7219652-7280-4a81-b501-b6e41673162b" (UID: "d7219652-7280-4a81-b501-b6e41673162b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.715790 4775 scope.go:117] "RemoveContainer" containerID="2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"
Mar 21 05:09:14 crc kubenswrapper[4775]: E0321 05:09:14.716262 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb\": container with ID starting with 2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb not found: ID does not exist" containerID="2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.716320 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"} err="failed to get container status \"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb\": rpc error: code = NotFound desc = could not find container \"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb\": container with ID starting with 2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb not found: ID does not exist"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.716359 4775 scope.go:117] "RemoveContainer" containerID="875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"
Mar 21 05:09:14 crc kubenswrapper[4775]: E0321 05:09:14.716760 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433\": container with ID starting with 875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433 not found: ID does not exist" containerID="875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.716817 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"} err="failed to get container status \"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433\": rpc error: code = NotFound desc = could not find container \"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433\": container with ID starting with 875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433 not found: ID does not exist"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.716848 4775 scope.go:117] "RemoveContainer" containerID="2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.717378 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb"} err="failed to get container status \"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb\": rpc error: code = NotFound desc = could not find container \"2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb\": container with ID starting with 2076143ea51df5acf1912050c6b923d311867381f6ad9a4e7c31e5ba00ded2cb not found: ID does not exist"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.717408 4775 scope.go:117] "RemoveContainer" containerID="875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.717854 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433"} err="failed to get container status \"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433\": rpc error: code = NotFound desc = could not find container \"875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433\": container with ID starting with 875af4c853efd2be80431c69406f831a833e919fb5a28d0763b186677c3a8433 not found: ID does not exist"
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.768224 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9m8c\" (UniqueName: \"kubernetes.io/projected/d7219652-7280-4a81-b501-b6e41673162b-kube-api-access-g9m8c\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.768255 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:14 crc kubenswrapper[4775]: I0321 05:09:14.768266 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7219652-7280-4a81-b501-b6e41673162b-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.011443 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.024189 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.036181 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:15 crc kubenswrapper[4775]: E0321 05:09:15.036637 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-metadata"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.036663 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-metadata"
Mar 21 05:09:15 crc kubenswrapper[4775]: E0321 05:09:15.036768 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-log"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.036779 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-log"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.036973 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-metadata"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.037000 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7219652-7280-4a81-b501-b6e41673162b" containerName="nova-metadata-log"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.038015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.040805 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.042443 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.067907 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.073381 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.073643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.073823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.073882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsw75\" (UniqueName: \"kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.073906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180111 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180274 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsw75\" (UniqueName: \"kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.180872 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.184052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.184192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.198834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsw75\" (UniqueName: \"kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.198942 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data\") pod \"nova-metadata-0\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.356562 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.677526 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7219652-7280-4a81-b501-b6e41673162b" path="/var/lib/kubelet/pods/d7219652-7280-4a81-b501-b6e41673162b/volumes"
Mar 21 05:09:15 crc kubenswrapper[4775]: I0321 05:09:15.830842 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.690205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerStarted","Data":"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06"}
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.690577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerStarted","Data":"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b"}
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.690599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerStarted","Data":"1aef16226ce61b14de14608541f12818545fa3117a238adb67e56a57bcd74341"}
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.741525 4775 generic.go:334] "Generic (PLEG): container finished" podID="5a407dd6-593f-4806-884b-8e031639d25d" containerID="650b9597595a69528b159834c598c6d1c2eb6858daf6ed558c1a6a03d2d450dd" exitCode=0
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.741591 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pckp8" event={"ID":"5a407dd6-593f-4806-884b-8e031639d25d","Type":"ContainerDied","Data":"650b9597595a69528b159834c598c6d1c2eb6858daf6ed558c1a6a03d2d450dd"}
Mar 21 05:09:16 crc kubenswrapper[4775]: I0321 05:09:16.797687 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.797665573 podStartE2EDuration="1.797665573s" podCreationTimestamp="2026-03-21 05:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:16.762161729 +0000 UTC m=+1309.738625373" watchObservedRunningTime="2026-03-21 05:09:16.797665573 +0000 UTC m=+1309.774129197"
Mar 21 05:09:17 crc kubenswrapper[4775]: I0321 05:09:17.937427 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 05:09:17 crc kubenswrapper[4775]: I0321 05:09:17.937872 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.055245 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.055313 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.086823 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.104369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.115137 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pckp8"
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.138468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts\") pod \"5a407dd6-593f-4806-884b-8e031639d25d\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") "
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.138597 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data\") pod \"5a407dd6-593f-4806-884b-8e031639d25d\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") "
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.138647 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdhj\" (UniqueName: \"kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj\") pod \"5a407dd6-593f-4806-884b-8e031639d25d\" (UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") "
Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.138722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle\") pod \"5a407dd6-593f-4806-884b-8e031639d25d\"
(UID: \"5a407dd6-593f-4806-884b-8e031639d25d\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.160277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj" (OuterVolumeSpecName: "kube-api-access-zbdhj") pod "5a407dd6-593f-4806-884b-8e031639d25d" (UID: "5a407dd6-593f-4806-884b-8e031639d25d"). InnerVolumeSpecName "kube-api-access-zbdhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.190358 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts" (OuterVolumeSpecName: "scripts") pod "5a407dd6-593f-4806-884b-8e031639d25d" (UID: "5a407dd6-593f-4806-884b-8e031639d25d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.205579 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a407dd6-593f-4806-884b-8e031639d25d" (UID: "5a407dd6-593f-4806-884b-8e031639d25d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.224239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data" (OuterVolumeSpecName: "config-data") pod "5a407dd6-593f-4806-884b-8e031639d25d" (UID: "5a407dd6-593f-4806-884b-8e031639d25d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.224606 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"] Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.225080 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="dnsmasq-dns" containerID="cri-o://07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e" gracePeriod=10 Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.242639 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.242699 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdhj\" (UniqueName: \"kubernetes.io/projected/5a407dd6-593f-4806-884b-8e031639d25d-kube-api-access-zbdhj\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.242715 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.242725 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a407dd6-593f-4806-884b-8e031639d25d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.246107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.688672 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.752908 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.753171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbc62\" (UniqueName: \"kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.753417 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.753450 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.753650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.753802 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config\") pod \"d7dc929b-501f-43cb-8811-394841cc54f9\" (UID: \"d7dc929b-501f-43cb-8811-394841cc54f9\") " Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.766942 4775 generic.go:334] "Generic (PLEG): container finished" podID="d7dc929b-501f-43cb-8811-394841cc54f9" containerID="07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e" exitCode=0 Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.767060 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" event={"ID":"d7dc929b-501f-43cb-8811-394841cc54f9","Type":"ContainerDied","Data":"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e"} Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.767102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" event={"ID":"d7dc929b-501f-43cb-8811-394841cc54f9","Type":"ContainerDied","Data":"4431b638594d2b34fefc401fb2d4af3e48aa80179d99ea9d13c2365707189fa3"} Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.767144 4775 scope.go:117] "RemoveContainer" containerID="07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.767630 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w2ltw" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.776825 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pckp8" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.776966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62" (OuterVolumeSpecName: "kube-api-access-tbc62") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). 
InnerVolumeSpecName "kube-api-access-tbc62". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.777027 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pckp8" event={"ID":"5a407dd6-593f-4806-884b-8e031639d25d","Type":"ContainerDied","Data":"040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90"} Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.777070 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="040855e862320088965a0b53c32aefe654235d67e7c2aec1b7da6cf02854cd90" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.797581 4775 scope.go:117] "RemoveContainer" containerID="6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.826489 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config" (OuterVolumeSpecName: "config") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.847543 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.850829 4775 scope.go:117] "RemoveContainer" containerID="07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.851938 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 05:09:18 crc kubenswrapper[4775]: E0321 05:09:18.859907 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e\": container with ID starting with 07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e not found: ID does not exist" containerID="07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.859951 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e"} err="failed to get container status \"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e\": rpc error: code = NotFound desc = could not find container \"07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e\": container with ID starting with 07d83e49b0e806d759a2ee326367b45aab2ed5d87b3ee996d3b75ec5f20c736e not found: ID does not exist" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.859974 4775 scope.go:117] "RemoveContainer" containerID="6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.860318 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbc62\" (UniqueName: \"kubernetes.io/projected/d7dc929b-501f-43cb-8811-394841cc54f9-kube-api-access-tbc62\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.860364 4775 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.860379 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: E0321 05:09:18.860533 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25\": container with ID starting with 6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25 not found: ID does not exist" containerID="6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.860556 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25"} err="failed to get container status \"6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25\": rpc error: code = NotFound desc = could not find container \"6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25\": container with ID starting with 6ab789fb8fabdc5b2c75b49b35b369108626a6a98dbf396798f603e25b157d25 not found: ID does not exist" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.860580 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.863355 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.865263 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7dc929b-501f-43cb-8811-394841cc54f9" (UID: "d7dc929b-501f-43cb-8811-394841cc54f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.938745 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.952531 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.952790 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-log" containerID="cri-o://a3e8af6598015c259cd8e37d94c33e2e14ca3a07bdbbfc459894647fdb9e40ea" gracePeriod=30 Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.952938 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-api" containerID="cri-o://3d41a9111ff02a22981deef2e158c7aaae04c761a3f477f9cde2ecff8054c0f1" gracePeriod=30 Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.961903 4775 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.961941 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.961952 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7dc929b-501f-43cb-8811-394841cc54f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.963722 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.963888 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.964041 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:09:18 crc kubenswrapper[4775]: I0321 05:09:18.964114 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-log" containerID="cri-o://2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" gracePeriod=30 Mar 21 05:09:18 crc kubenswrapper[4775]: 
I0321 05:09:18.964318 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-metadata" containerID="cri-o://a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" gracePeriod=30 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.131194 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"] Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.138536 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w2ltw"] Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.685879 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" path="/var/lib/kubelet/pods/d7dc929b-501f-43cb-8811-394841cc54f9/volumes" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.739430 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.794397 4775 generic.go:334] "Generic (PLEG): container finished" podID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerID="a3e8af6598015c259cd8e37d94c33e2e14ca3a07bdbbfc459894647fdb9e40ea" exitCode=143 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.794460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerDied","Data":"a3e8af6598015c259cd8e37d94c33e2e14ca3a07bdbbfc459894647fdb9e40ea"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797499 4775 generic.go:334] "Generic (PLEG): container finished" podID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerID="a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" exitCode=0 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797518 4775 generic.go:334] "Generic (PLEG): container finished" podID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerID="2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" exitCode=143 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797555 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerDied","Data":"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerDied","Data":"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"82600d19-b64e-4737-8a1f-ff1289daa3ed","Type":"ContainerDied","Data":"1aef16226ce61b14de14608541f12818545fa3117a238adb67e56a57bcd74341"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797614 4775 scope.go:117] "RemoveContainer" containerID="a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.797923 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.798833 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.803684 4775 generic.go:334] "Generic (PLEG): container finished" podID="fb1dd6e9-8801-464e-952c-d345498b132a" containerID="166965bd5533ce39880e58f576322a3fbe5ebbddcee7067e02f0a62bbadf6013" exitCode=0 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.803738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vd764" event={"ID":"fb1dd6e9-8801-464e-952c-d345498b132a","Type":"ContainerDied","Data":"166965bd5533ce39880e58f576322a3fbe5ebbddcee7067e02f0a62bbadf6013"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.809241 4775 generic.go:334] "Generic (PLEG): container finished" podID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerID="30455e621655fa15c45c8fad2a223bc59355675a9f5b3da03f27d5155ab96f2d" exitCode=137 Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.810011 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.810212 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab91b8d2-32c9-4542-a730-7fc84e9892bd","Type":"ContainerDied","Data":"30455e621655fa15c45c8fad2a223bc59355675a9f5b3da03f27d5155ab96f2d"} Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.837696 4775 scope.go:117] "RemoveContainer" containerID="2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.866852 4775 scope.go:117] "RemoveContainer" containerID="a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" Mar 21 05:09:19 crc kubenswrapper[4775]: E0321 05:09:19.869592 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06\": container with ID starting with a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06 not found: ID does not exist" containerID="a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.869728 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06"} err="failed to get container status \"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06\": rpc error: code = NotFound desc = could not find container \"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06\": container with ID starting with a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06 not found: ID does not exist" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.869834 4775 scope.go:117] "RemoveContainer" containerID="2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" Mar 21 05:09:19 crc kubenswrapper[4775]: E0321 
05:09:19.870397 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b\": container with ID starting with 2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b not found: ID does not exist" containerID="2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.870444 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b"} err="failed to get container status \"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b\": rpc error: code = NotFound desc = could not find container \"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b\": container with ID starting with 2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b not found: ID does not exist" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.870467 4775 scope.go:117] "RemoveContainer" containerID="a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.871425 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06"} err="failed to get container status \"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06\": rpc error: code = NotFound desc = could not find container \"a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06\": container with ID starting with a9e328b0de38c9a70bb3528a6b0dd0685f8454aa009ccc97d323977d2e868a06 not found: ID does not exist" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.871458 4775 scope.go:117] "RemoveContainer" containerID="2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b" Mar 21 05:09:19 crc 
kubenswrapper[4775]: I0321 05:09:19.872777 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b"} err="failed to get container status \"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b\": rpc error: code = NotFound desc = could not find container \"2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b\": container with ID starting with 2cc3d35e715a6db1c79daff232799e16bf15b4a6f8ae63e1656332d4536a706b not found: ID does not exist" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.872802 4775 scope.go:117] "RemoveContainer" containerID="30455e621655fa15c45c8fad2a223bc59355675a9f5b3da03f27d5155ab96f2d" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.893849 4775 scope.go:117] "RemoveContainer" containerID="756b7ac25394ff7e03c954f4dded0675f6295888345b0b7a098c3ad32426a8ef" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.904190 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsw75\" (UniqueName: \"kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75\") pod \"82600d19-b64e-4737-8a1f-ff1289daa3ed\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.905235 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.905334 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: 
\"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.905902 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906063 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data\") pod \"82600d19-b64e-4737-8a1f-ff1289daa3ed\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs\") pod \"82600d19-b64e-4737-8a1f-ff1289daa3ed\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906145 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs\") pod \"82600d19-b64e-4737-8a1f-ff1289daa3ed\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906190 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle\") pod \"82600d19-b64e-4737-8a1f-ff1289daa3ed\" (UID: \"82600d19-b64e-4737-8a1f-ff1289daa3ed\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906214 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906234 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.906269 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4msqv\" (UniqueName: \"kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv\") pod \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\" (UID: \"ab91b8d2-32c9-4542-a730-7fc84e9892bd\") " Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.907034 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs" (OuterVolumeSpecName: "logs") pod "82600d19-b64e-4737-8a1f-ff1289daa3ed" (UID: "82600d19-b64e-4737-8a1f-ff1289daa3ed"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.907410 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82600d19-b64e-4737-8a1f-ff1289daa3ed-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.907681 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.907739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.912559 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75" (OuterVolumeSpecName: "kube-api-access-jsw75") pod "82600d19-b64e-4737-8a1f-ff1289daa3ed" (UID: "82600d19-b64e-4737-8a1f-ff1289daa3ed"). InnerVolumeSpecName "kube-api-access-jsw75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.915462 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts" (OuterVolumeSpecName: "scripts") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.924562 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv" (OuterVolumeSpecName: "kube-api-access-4msqv") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "kube-api-access-4msqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.931256 4775 scope.go:117] "RemoveContainer" containerID="8cc8d28938a70addf2d6412cf322bdf56956d1adec39274fbeadbaacd685266d" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.952404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data" (OuterVolumeSpecName: "config-data") pod "82600d19-b64e-4737-8a1f-ff1289daa3ed" (UID: "82600d19-b64e-4737-8a1f-ff1289daa3ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.959362 4775 scope.go:117] "RemoveContainer" containerID="f3b6a0d856af07fa3386647a1df14c43107fa18d16407653836fe8f6805030db" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.960608 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82600d19-b64e-4737-8a1f-ff1289daa3ed" (UID: "82600d19-b64e-4737-8a1f-ff1289daa3ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.961156 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:19 crc kubenswrapper[4775]: I0321 05:09:19.976299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "82600d19-b64e-4737-8a1f-ff1289daa3ed" (UID: "82600d19-b64e-4737-8a1f-ff1289daa3ed"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008702 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008730 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008740 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008751 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82600d19-b64e-4737-8a1f-ff1289daa3ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008758 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008768 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab91b8d2-32c9-4542-a730-7fc84e9892bd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008777 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4msqv\" (UniqueName: \"kubernetes.io/projected/ab91b8d2-32c9-4542-a730-7fc84e9892bd-kube-api-access-4msqv\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008785 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsw75\" (UniqueName: \"kubernetes.io/projected/82600d19-b64e-4737-8a1f-ff1289daa3ed-kube-api-access-jsw75\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.008793 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.034294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data" (OuterVolumeSpecName: "config-data") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.038358 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab91b8d2-32c9-4542-a730-7fc84e9892bd" (UID: "ab91b8d2-32c9-4542-a730-7fc84e9892bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.114082 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.114144 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab91b8d2-32c9-4542-a730-7fc84e9892bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.166225 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.178753 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.202078 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.214074 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.225535 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227177 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-notification-agent" Mar 
21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227259 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-notification-agent" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227337 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-central-agent" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227395 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-central-agent" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227452 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-metadata" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227507 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-metadata" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227568 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="proxy-httpd" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227620 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="proxy-httpd" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227672 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a407dd6-593f-4806-884b-8e031639d25d" containerName="nova-manage" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227725 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a407dd6-593f-4806-884b-8e031639d25d" containerName="nova-manage" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227797 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="init" 
Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.227863 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="init" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.227943 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="sg-core" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228008 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="sg-core" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.228072 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="dnsmasq-dns" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228177 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="dnsmasq-dns" Mar 21 05:09:20 crc kubenswrapper[4775]: E0321 05:09:20.228242 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-log" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228295 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-log" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228528 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-central-agent" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228602 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-metadata" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228661 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="proxy-httpd" Mar 21 05:09:20 crc 
kubenswrapper[4775]: I0321 05:09:20.228727 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a407dd6-593f-4806-884b-8e031639d25d" containerName="nova-manage" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228785 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" containerName="nova-metadata-log" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228839 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="ceilometer-notification-agent" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228893 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7dc929b-501f-43cb-8811-394841cc54f9" containerName="dnsmasq-dns" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.228942 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" containerName="sg-core" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.230948 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.236541 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.236759 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.240272 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.253574 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.257035 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.257366 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.285442 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.292831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316514 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.316921 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317146 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts\") pod \"ceilometer-0\" (UID: 
\"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plq2h\" (UniqueName: \"kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrdg\" (UniqueName: \"kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.317287 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 
05:09:20.419305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plq2h\" (UniqueName: \"kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419449 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrdg\" (UniqueName: \"kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419467 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.419670 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.420162 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.420788 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.422488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.423090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.428263 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.428804 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " 
pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.434101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.444738 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.454761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plq2h\" (UniqueName: \"kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.455063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data\") pod \"nova-metadata-0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.465061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrdg\" (UniqueName: \"kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg\") pod \"ceilometer-0\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.567014 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.582839 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:09:20 crc kubenswrapper[4775]: I0321 05:09:20.841010 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerName="nova-scheduler-scheduler" containerID="cri-o://4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" gracePeriod=30 Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.081459 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.197796 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:09:21 crc kubenswrapper[4775]: W0321 05:09:21.201032 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636228b7_669b_4b5d_abec_bf78cb1513f0.slice/crio-c6bcff38ed2ee79c0028a43362b2200b49c362342fd98956493476b35ddff899 WatchSource:0}: Error finding container c6bcff38ed2ee79c0028a43362b2200b49c362342fd98956493476b35ddff899: Status 404 returned error can't find the container with id c6bcff38ed2ee79c0028a43362b2200b49c362342fd98956493476b35ddff899 Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.338212 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vd764" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.372460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fmmn\" (UniqueName: \"kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn\") pod \"fb1dd6e9-8801-464e-952c-d345498b132a\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.372593 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts\") pod \"fb1dd6e9-8801-464e-952c-d345498b132a\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.372643 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data\") pod \"fb1dd6e9-8801-464e-952c-d345498b132a\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.372668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle\") pod \"fb1dd6e9-8801-464e-952c-d345498b132a\" (UID: \"fb1dd6e9-8801-464e-952c-d345498b132a\") " Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.384809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts" (OuterVolumeSpecName: "scripts") pod "fb1dd6e9-8801-464e-952c-d345498b132a" (UID: "fb1dd6e9-8801-464e-952c-d345498b132a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.387685 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn" (OuterVolumeSpecName: "kube-api-access-5fmmn") pod "fb1dd6e9-8801-464e-952c-d345498b132a" (UID: "fb1dd6e9-8801-464e-952c-d345498b132a"). InnerVolumeSpecName "kube-api-access-5fmmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.436149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb1dd6e9-8801-464e-952c-d345498b132a" (UID: "fb1dd6e9-8801-464e-952c-d345498b132a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.438990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data" (OuterVolumeSpecName: "config-data") pod "fb1dd6e9-8801-464e-952c-d345498b132a" (UID: "fb1dd6e9-8801-464e-952c-d345498b132a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.474472 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fmmn\" (UniqueName: \"kubernetes.io/projected/fb1dd6e9-8801-464e-952c-d345498b132a-kube-api-access-5fmmn\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.474499 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.474507 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.474519 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb1dd6e9-8801-464e-952c-d345498b132a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.672954 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82600d19-b64e-4737-8a1f-ff1289daa3ed" path="/var/lib/kubelet/pods/82600d19-b64e-4737-8a1f-ff1289daa3ed/volumes" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.673569 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab91b8d2-32c9-4542-a730-7fc84e9892bd" path="/var/lib/kubelet/pods/ab91b8d2-32c9-4542-a730-7fc84e9892bd/volumes" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.852984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerStarted","Data":"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.853033 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerStarted","Data":"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.853048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerStarted","Data":"c6bcff38ed2ee79c0028a43362b2200b49c362342fd98956493476b35ddff899"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.855251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vd764" event={"ID":"fb1dd6e9-8801-464e-952c-d345498b132a","Type":"ContainerDied","Data":"5e30087cecb9bbd6f3df2a364239f70f52ef2de20ec9c4dfd84fb90814b44306"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.855299 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e30087cecb9bbd6f3df2a364239f70f52ef2de20ec9c4dfd84fb90814b44306" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.855308 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vd764" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.857613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerStarted","Data":"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.857646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerStarted","Data":"bd088ed9e7536178799aa0168c0a00723a8b5b8b146cbd4aa52d938131eb733b"} Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.873810 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8737908349999999 podStartE2EDuration="1.873790835s" podCreationTimestamp="2026-03-21 05:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:21.868640489 +0000 UTC m=+1314.845104113" watchObservedRunningTime="2026-03-21 05:09:21.873790835 +0000 UTC m=+1314.850254459" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.939000 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:09:21 crc kubenswrapper[4775]: E0321 05:09:21.939743 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb1dd6e9-8801-464e-952c-d345498b132a" containerName="nova-cell1-conductor-db-sync" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.939764 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb1dd6e9-8801-464e-952c-d345498b132a" containerName="nova-cell1-conductor-db-sync" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.939939 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb1dd6e9-8801-464e-952c-d345498b132a" 
containerName="nova-cell1-conductor-db-sync" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.940560 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.944348 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.950136 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.985218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.985259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl24b\" (UniqueName: \"kubernetes.io/projected/b29323a7-476f-4a13-8085-4b2158a68850-kube-api-access-kl24b\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:21 crc kubenswrapper[4775]: I0321 05:09:21.985292 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.087555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.087617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl24b\" (UniqueName: \"kubernetes.io/projected/b29323a7-476f-4a13-8085-4b2158a68850-kube-api-access-kl24b\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.087652 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.094881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.094928 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29323a7-476f-4a13-8085-4b2158a68850-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.107572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl24b\" (UniqueName: \"kubernetes.io/projected/b29323a7-476f-4a13-8085-4b2158a68850-kube-api-access-kl24b\") pod \"nova-cell1-conductor-0\" (UID: 
\"b29323a7-476f-4a13-8085-4b2158a68850\") " pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.258659 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:22 crc kubenswrapper[4775]: W0321 05:09:22.786145 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb29323a7_476f_4a13_8085_4b2158a68850.slice/crio-00db731407ff126bb6e108665e2bc8cb1ba2533f6f82c1e28ee6f27a51895049 WatchSource:0}: Error finding container 00db731407ff126bb6e108665e2bc8cb1ba2533f6f82c1e28ee6f27a51895049: Status 404 returned error can't find the container with id 00db731407ff126bb6e108665e2bc8cb1ba2533f6f82c1e28ee6f27a51895049 Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.791021 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.870968 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b29323a7-476f-4a13-8085-4b2158a68850","Type":"ContainerStarted","Data":"00db731407ff126bb6e108665e2bc8cb1ba2533f6f82c1e28ee6f27a51895049"} Mar 21 05:09:22 crc kubenswrapper[4775]: I0321 05:09:22.874174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerStarted","Data":"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d"} Mar 21 05:09:23 crc kubenswrapper[4775]: E0321 05:09:23.057297 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:09:23 crc 
kubenswrapper[4775]: E0321 05:09:23.058510 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:09:23 crc kubenswrapper[4775]: E0321 05:09:23.082518 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:09:23 crc kubenswrapper[4775]: E0321 05:09:23.082601 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerName="nova-scheduler-scheduler" Mar 21 05:09:23 crc kubenswrapper[4775]: I0321 05:09:23.886816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b29323a7-476f-4a13-8085-4b2158a68850","Type":"ContainerStarted","Data":"16628276422a39d68f8845f2fe31fdf04cb116e0f714c01cf1ba2a054a43b6d1"} Mar 21 05:09:23 crc kubenswrapper[4775]: I0321 05:09:23.887461 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 21 05:09:23 crc kubenswrapper[4775]: I0321 05:09:23.890499 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerStarted","Data":"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db"} Mar 21 05:09:23 crc kubenswrapper[4775]: I0321 05:09:23.919374 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.9193526419999998 podStartE2EDuration="2.919352642s" podCreationTimestamp="2026-03-21 05:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:23.908525136 +0000 UTC m=+1316.884988770" watchObservedRunningTime="2026-03-21 05:09:23.919352642 +0000 UTC m=+1316.895816286" Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.904105 4775 generic.go:334] "Generic (PLEG): container finished" podID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerID="3d41a9111ff02a22981deef2e158c7aaae04c761a3f477f9cde2ecff8054c0f1" exitCode=0 Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.904149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerDied","Data":"3d41a9111ff02a22981deef2e158c7aaae04c761a3f477f9cde2ecff8054c0f1"} Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.904585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"87dad904-a84b-4dcf-9caa-3fe89053b96b","Type":"ContainerDied","Data":"3c21cc3a28ac294006c3d112d32cbbbabe9d8f787077b90e376062aeec6dab3e"} Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.904605 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c21cc3a28ac294006c3d112d32cbbbabe9d8f787077b90e376062aeec6dab3e" Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.906630 4775 generic.go:334] "Generic (PLEG): container finished" podID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerID="4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" exitCode=0 Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.906695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"e114fa87-9e65-4a0b-bad1-81f521aeef85","Type":"ContainerDied","Data":"4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a"} Mar 21 05:09:24 crc kubenswrapper[4775]: I0321 05:09:24.999819 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.036373 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs\") pod \"87dad904-a84b-4dcf-9caa-3fe89053b96b\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.036460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txpj5\" (UniqueName: \"kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5\") pod \"87dad904-a84b-4dcf-9caa-3fe89053b96b\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.036502 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data\") pod \"87dad904-a84b-4dcf-9caa-3fe89053b96b\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.036539 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle\") pod \"87dad904-a84b-4dcf-9caa-3fe89053b96b\" (UID: \"87dad904-a84b-4dcf-9caa-3fe89053b96b\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.036887 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs" (OuterVolumeSpecName: "logs") pod 
"87dad904-a84b-4dcf-9caa-3fe89053b96b" (UID: "87dad904-a84b-4dcf-9caa-3fe89053b96b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.044356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5" (OuterVolumeSpecName: "kube-api-access-txpj5") pod "87dad904-a84b-4dcf-9caa-3fe89053b96b" (UID: "87dad904-a84b-4dcf-9caa-3fe89053b96b"). InnerVolumeSpecName "kube-api-access-txpj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.110332 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data" (OuterVolumeSpecName: "config-data") pod "87dad904-a84b-4dcf-9caa-3fe89053b96b" (UID: "87dad904-a84b-4dcf-9caa-3fe89053b96b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.114490 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87dad904-a84b-4dcf-9caa-3fe89053b96b" (UID: "87dad904-a84b-4dcf-9caa-3fe89053b96b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.137708 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87dad904-a84b-4dcf-9caa-3fe89053b96b-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.137730 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txpj5\" (UniqueName: \"kubernetes.io/projected/87dad904-a84b-4dcf-9caa-3fe89053b96b-kube-api-access-txpj5\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.137740 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.137750 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87dad904-a84b-4dcf-9caa-3fe89053b96b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.339375 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.441588 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data\") pod \"e114fa87-9e65-4a0b-bad1-81f521aeef85\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.441648 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle\") pod \"e114fa87-9e65-4a0b-bad1-81f521aeef85\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.441819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpl5g\" (UniqueName: \"kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g\") pod \"e114fa87-9e65-4a0b-bad1-81f521aeef85\" (UID: \"e114fa87-9e65-4a0b-bad1-81f521aeef85\") " Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.445130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g" (OuterVolumeSpecName: "kube-api-access-xpl5g") pod "e114fa87-9e65-4a0b-bad1-81f521aeef85" (UID: "e114fa87-9e65-4a0b-bad1-81f521aeef85"). InnerVolumeSpecName "kube-api-access-xpl5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.484773 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e114fa87-9e65-4a0b-bad1-81f521aeef85" (UID: "e114fa87-9e65-4a0b-bad1-81f521aeef85"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.516172 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data" (OuterVolumeSpecName: "config-data") pod "e114fa87-9e65-4a0b-bad1-81f521aeef85" (UID: "e114fa87-9e65-4a0b-bad1-81f521aeef85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.547062 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpl5g\" (UniqueName: \"kubernetes.io/projected/e114fa87-9e65-4a0b-bad1-81f521aeef85-kube-api-access-xpl5g\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.547420 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.547433 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e114fa87-9e65-4a0b-bad1-81f521aeef85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.928612 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerStarted","Data":"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9"} Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.929369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.931140 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.931860 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.932232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e114fa87-9e65-4a0b-bad1-81f521aeef85","Type":"ContainerDied","Data":"8a454c45ca71508ee3e69e89e31c30561bb964893c09c7b3c20372f820d152ee"} Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.932268 4775 scope.go:117] "RemoveContainer" containerID="4db8a2108514793dd399803fe0f901aa4854b0ae09ceaf27535063e2f431666a" Mar 21 05:09:25 crc kubenswrapper[4775]: I0321 05:09:25.950234 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.604476482 podStartE2EDuration="5.950212725s" podCreationTimestamp="2026-03-21 05:09:20 +0000 UTC" firstStartedPulling="2026-03-21 05:09:21.095857523 +0000 UTC m=+1314.072321147" lastFinishedPulling="2026-03-21 05:09:25.441593766 +0000 UTC m=+1318.418057390" observedRunningTime="2026-03-21 05:09:25.947934851 +0000 UTC m=+1318.924398475" watchObservedRunningTime="2026-03-21 05:09:25.950212725 +0000 UTC m=+1318.926676349" Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.008863 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.020108 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.044110 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.053235 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:09:26 crc kubenswrapper[4775]: E0321 05:09:26.053929 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-log"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.053954 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-log"
Mar 21 05:09:26 crc kubenswrapper[4775]: E0321 05:09:26.053988 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-api"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.053997 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-api"
Mar 21 05:09:26 crc kubenswrapper[4775]: E0321 05:09:26.054014 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerName="nova-scheduler-scheduler"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.054020 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerName="nova-scheduler-scheduler"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.054248 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" containerName="nova-scheduler-scheduler"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.054268 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-log"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.054285 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" containerName="nova-api-api"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.054867 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.060037 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.071928 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.082005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.089452 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.091027 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.092586 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.099920 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160640 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krq52\" (UniqueName: \"kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160795 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160812 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.160844 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jkv8\" (UniqueName: \"kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.262787 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.262845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.262888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jkv8\" (UniqueName: \"kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.262965 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.263010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.263032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.263065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krq52\" (UniqueName: \"kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.264389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.268096 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.270268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.282892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.285098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.285830 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jkv8\" (UniqueName: \"kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8\") pod \"nova-scheduler-0\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.286338 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krq52\" (UniqueName: \"kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52\") pod \"nova-api-0\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.372186 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.416393 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.855929 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: W0321 05:09:26.866073 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f713501_fa26_4ef2_87ac_c5376b84891d.slice/crio-35d35b17b5e3ddd8509d6d47f432a3246971c8785668db26e3bf835f5d64e720 WatchSource:0}: Error finding container 35d35b17b5e3ddd8509d6d47f432a3246971c8785668db26e3bf835f5d64e720: Status 404 returned error can't find the container with id 35d35b17b5e3ddd8509d6d47f432a3246971c8785668db26e3bf835f5d64e720
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.922150 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 21 05:09:26 crc kubenswrapper[4775]: W0321 05:09:26.928564 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d4895fa_5b3e_423f_85aa_030d2db076b3.slice/crio-d95657026d4984394319083a88a788fdbce021127cb6cbf909a8f8735dc34db8 WatchSource:0}: Error finding container d95657026d4984394319083a88a788fdbce021127cb6cbf909a8f8735dc34db8: Status 404 returned error can't find the container with id d95657026d4984394319083a88a788fdbce021127cb6cbf909a8f8735dc34db8
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.947346 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerStarted","Data":"d95657026d4984394319083a88a788fdbce021127cb6cbf909a8f8735dc34db8"}
Mar 21 05:09:26 crc kubenswrapper[4775]: I0321 05:09:26.950856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f713501-fa26-4ef2-87ac-c5376b84891d","Type":"ContainerStarted","Data":"35d35b17b5e3ddd8509d6d47f432a3246971c8785668db26e3bf835f5d64e720"}
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.306040 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.702908 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87dad904-a84b-4dcf-9caa-3fe89053b96b" path="/var/lib/kubelet/pods/87dad904-a84b-4dcf-9caa-3fe89053b96b/volumes"
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.703822 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e114fa87-9e65-4a0b-bad1-81f521aeef85" path="/var/lib/kubelet/pods/e114fa87-9e65-4a0b-bad1-81f521aeef85/volumes"
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.982368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerStarted","Data":"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285"}
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.982423 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerStarted","Data":"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448"}
Mar 21 05:09:27 crc kubenswrapper[4775]: I0321 05:09:27.984893 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f713501-fa26-4ef2-87ac-c5376b84891d","Type":"ContainerStarted","Data":"c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2"}
Mar 21 05:09:28 crc kubenswrapper[4775]: I0321 05:09:28.019712 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.019692629 podStartE2EDuration="3.019692629s" podCreationTimestamp="2026-03-21 05:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:28.012452044 +0000 UTC m=+1320.988915678" watchObservedRunningTime="2026-03-21 05:09:28.019692629 +0000 UTC m=+1320.996156253"
Mar 21 05:09:28 crc kubenswrapper[4775]: I0321 05:09:28.030097 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.030077242 podStartE2EDuration="3.030077242s" podCreationTimestamp="2026-03-21 05:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:28.02857394 +0000 UTC m=+1321.005037564" watchObservedRunningTime="2026-03-21 05:09:28.030077242 +0000 UTC m=+1321.006540866"
Mar 21 05:09:30 crc kubenswrapper[4775]: I0321 05:09:30.583692 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 21 05:09:30 crc kubenswrapper[4775]: I0321 05:09:30.584088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 21 05:09:31 crc kubenswrapper[4775]: I0321 05:09:31.375338 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 21 05:09:31 crc kubenswrapper[4775]: I0321 05:09:31.600478 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 05:09:31 crc kubenswrapper[4775]: I0321 05:09:31.600463 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 21 05:09:36 crc kubenswrapper[4775]: I0321 05:09:36.373810 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 21 05:09:36 crc kubenswrapper[4775]: I0321 05:09:36.417050 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 05:09:36 crc kubenswrapper[4775]: I0321 05:09:36.417108 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 21 05:09:36 crc kubenswrapper[4775]: I0321 05:09:36.419536 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 21 05:09:37 crc kubenswrapper[4775]: I0321 05:09:37.094500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 21 05:09:37 crc kubenswrapper[4775]: I0321 05:09:37.499447 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 05:09:37 crc kubenswrapper[4775]: I0321 05:09:37.499466 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 21 05:09:38 crc kubenswrapper[4775]: I0321 05:09:38.583208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 05:09:38 crc kubenswrapper[4775]: I0321 05:09:38.583270 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 21 05:09:40 crc kubenswrapper[4775]: I0321 05:09:40.589501 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 05:09:40 crc kubenswrapper[4775]: I0321 05:09:40.591587 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 21 05:09:40 crc kubenswrapper[4775]: I0321 05:09:40.598573 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 05:09:41 crc kubenswrapper[4775]: I0321 05:09:41.107517 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.028306 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.118282 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data\") pod \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") "
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.118409 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle\") pod \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") "
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.118793 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swsh9\" (UniqueName: \"kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9\") pod \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\" (UID: \"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2\") "
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.126031 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9" (OuterVolumeSpecName: "kube-api-access-swsh9") pod "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" (UID: "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2"). InnerVolumeSpecName "kube-api-access-swsh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.138608 4775 generic.go:334] "Generic (PLEG): container finished" podID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" containerID="bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa" exitCode=137
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.138693 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.138754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2","Type":"ContainerDied","Data":"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"}
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.139266 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2","Type":"ContainerDied","Data":"e7ffc178120764777a6e0c0217aa92bdaf948da3b3e7aaf4502f70ada80abfb2"}
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.139307 4775 scope.go:117] "RemoveContainer" containerID="bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.150616 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data" (OuterVolumeSpecName: "config-data") pod "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" (UID: "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.162088 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" (UID: "e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.221639 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.221690 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.221702 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swsh9\" (UniqueName: \"kubernetes.io/projected/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2-kube-api-access-swsh9\") on node \"crc\" DevicePath \"\""
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.224006 4775 scope.go:117] "RemoveContainer" containerID="bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"
Mar 21 05:09:44 crc kubenswrapper[4775]: E0321 05:09:44.224510 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa\": container with ID starting with bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa not found: ID does not exist" containerID="bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.224544 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa"} err="failed to get container status \"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa\": rpc error: code = NotFound desc = could not find container \"bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa\": container with ID starting with bdf97a0cf8b488689d63ad976ec7a07bd6c1bed2c6f6a773ddf11243bbcc10aa not found: ID does not exist"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.417528 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.417825 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.487500 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.501236 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.524797 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:44 crc kubenswrapper[4775]: E0321 05:09:44.525425 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.525456 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.525743 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.526696 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.529067 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.529987 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.530298 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.538649 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.627278 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx5bx\" (UniqueName: \"kubernetes.io/projected/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-kube-api-access-wx5bx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.627383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.627519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.627616 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.627671 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.729750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.729918 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.730037 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.730114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.730248 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx5bx\" (UniqueName: \"kubernetes.io/projected/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-kube-api-access-wx5bx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.734247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.734478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.734920 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.736386 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.753150 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx5bx\" (UniqueName: \"kubernetes.io/projected/3b7ea443-e30d-41d1-9f42-0bef9d7bd012-kube-api-access-wx5bx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3b7ea443-e30d-41d1-9f42-0bef9d7bd012\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:44 crc kubenswrapper[4775]: I0321 05:09:44.860132 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 05:09:45 crc kubenswrapper[4775]: W0321 05:09:45.330930 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b7ea443_e30d_41d1_9f42_0bef9d7bd012.slice/crio-b222bd3e248b9d0166f90bc7b6192e1078b253859cb36e93ca4b525699197a5f WatchSource:0}: Error finding container b222bd3e248b9d0166f90bc7b6192e1078b253859cb36e93ca4b525699197a5f: Status 404 returned error can't find the container with id b222bd3e248b9d0166f90bc7b6192e1078b253859cb36e93ca4b525699197a5f
Mar 21 05:09:45 crc kubenswrapper[4775]: I0321 05:09:45.331983 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 05:09:45 crc kubenswrapper[4775]: I0321 05:09:45.680532 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2" path="/var/lib/kubelet/pods/e5cc1cb8-86fa-4708-bd8f-d8da98ebc0b2/volumes"
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.157811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3b7ea443-e30d-41d1-9f42-0bef9d7bd012","Type":"ContainerStarted","Data":"48d8fa4a899bf40e560d38784c8c256d93eddd6127cafea852c9a1cba4bc7d1b"}
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.158329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3b7ea443-e30d-41d1-9f42-0bef9d7bd012","Type":"ContainerStarted","Data":"b222bd3e248b9d0166f90bc7b6192e1078b253859cb36e93ca4b525699197a5f"}
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.182192 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.182173831 podStartE2EDuration="2.182173831s" podCreationTimestamp="2026-03-21 05:09:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:46.17542978 +0000 UTC m=+1339.151893404" watchObservedRunningTime="2026-03-21 05:09:46.182173831 +0000 UTC m=+1339.158637455"
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.424755 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.425857 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 05:09:46 crc kubenswrapper[4775]: I0321 05:09:46.439539 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.168437 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.378897 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"]
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.380394 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.391378 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"]
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429654 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429752 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7xb\" (UniqueName: \"kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429802 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.429958 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.531905 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.531995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.532045 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"
Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.532082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-dl7xb\" (UniqueName: \"kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.532106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.532132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.533144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.533451 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.533529 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.533572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.533875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.559465 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7xb\" (UniqueName: \"kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb\") pod \"dnsmasq-dns-cd5cbd7b9-rtnbf\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:47 crc kubenswrapper[4775]: I0321 05:09:47.740540 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:48 crc kubenswrapper[4775]: W0321 05:09:48.251274 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd725aabc_ba32_4c0e_bc91_8819d73cae40.slice/crio-5dc8047188086fad5c8d0b1fabab1913e5036d9e22f49d173fa13d6dd588e3c0 WatchSource:0}: Error finding container 5dc8047188086fad5c8d0b1fabab1913e5036d9e22f49d173fa13d6dd588e3c0: Status 404 returned error can't find the container with id 5dc8047188086fad5c8d0b1fabab1913e5036d9e22f49d173fa13d6dd588e3c0 Mar 21 05:09:48 crc kubenswrapper[4775]: I0321 05:09:48.255029 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"] Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.184506 4775 generic.go:334] "Generic (PLEG): container finished" podID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerID="e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974" exitCode=0 Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.184726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" event={"ID":"d725aabc-ba32-4c0e-bc91-8819d73cae40","Type":"ContainerDied","Data":"e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974"} Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.185165 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" event={"ID":"d725aabc-ba32-4c0e-bc91-8819d73cae40","Type":"ContainerStarted","Data":"5dc8047188086fad5c8d0b1fabab1913e5036d9e22f49d173fa13d6dd588e3c0"} Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.673954 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.674292 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-central-agent" containerID="cri-o://fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb" gracePeriod=30 Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.674336 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="sg-core" containerID="cri-o://8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db" gracePeriod=30 Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.674376 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-notification-agent" containerID="cri-o://e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d" gracePeriod=30 Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.674403 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" containerID="cri-o://1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9" gracePeriod=30 Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.698584 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.194:3000/\": EOF" Mar 21 05:09:49 crc kubenswrapper[4775]: I0321 05:09:49.860971 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.026143 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205455 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerID="1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9" exitCode=0 Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205488 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerID="8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db" exitCode=2 Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205495 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerID="fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb" exitCode=0 Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205546 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerDied","Data":"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9"} Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205588 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerDied","Data":"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db"} Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.205642 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerDied","Data":"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb"} Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.207611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" event={"ID":"d725aabc-ba32-4c0e-bc91-8819d73cae40","Type":"ContainerStarted","Data":"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325"} Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.207723 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-log" containerID="cri-o://ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448" gracePeriod=30 Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.207975 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-api" containerID="cri-o://ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285" gracePeriod=30 Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.240686 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" podStartSLOduration=3.240671834 podStartE2EDuration="3.240671834s" podCreationTimestamp="2026-03-21 05:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:50.235471457 +0000 UTC m=+1343.211935081" watchObservedRunningTime="2026-03-21 05:09:50.240671834 +0000 UTC m=+1343.217135458" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.725847 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.900980 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901332 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901460 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901500 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901525 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsrdg\" (UniqueName: 
\"kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.901639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml\") pod \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\" (UID: \"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53\") " Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.902271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.903326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.914222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts" (OuterVolumeSpecName: "scripts") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.915193 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg" (OuterVolumeSpecName: "kube-api-access-zsrdg") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "kube-api-access-zsrdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.931239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:50 crc kubenswrapper[4775]: I0321 05:09:50.991547 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005884 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005914 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005924 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005932 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005940 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.005951 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsrdg\" (UniqueName: \"kubernetes.io/projected/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-kube-api-access-zsrdg\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.021491 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data" (OuterVolumeSpecName: "config-data") pod "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" (UID: "e3dd1c1b-7aba-4083-8a9c-037cd38b2d53"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.108242 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.218694 4775 generic.go:334] "Generic (PLEG): container finished" podID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerID="e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d" exitCode=0 Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.218739 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.218759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerDied","Data":"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d"} Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.218978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e3dd1c1b-7aba-4083-8a9c-037cd38b2d53","Type":"ContainerDied","Data":"bd088ed9e7536178799aa0168c0a00723a8b5b8b146cbd4aa52d938131eb733b"} Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.219002 4775 scope.go:117] "RemoveContainer" containerID="1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.221500 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerID="ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448" exitCode=143 Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.221582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerDied","Data":"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448"} Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.221638 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.249757 4775 scope.go:117] "RemoveContainer" containerID="8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.254460 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.280705 4775 scope.go:117] "RemoveContainer" containerID="e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.283488 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.294648 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.295206 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295291 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.295319 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-notification-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295327 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-notification-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 
05:09:51.295350 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-central-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295359 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-central-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.295376 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="sg-core" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295385 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="sg-core" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295670 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-notification-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295706 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295752 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="ceilometer-central-agent" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.295770 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="sg-core" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.297997 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.305292 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.307482 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.307790 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.332065 4775 scope.go:117] "RemoveContainer" containerID="fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.361983 4775 scope.go:117] "RemoveContainer" containerID="1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.362595 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9\": container with ID starting with 1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9 not found: ID does not exist" containerID="1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.362788 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9"} err="failed to get container status \"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9\": rpc error: code = NotFound desc = could not find container \"1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9\": container with ID starting with 1019edd2ab7e9cd0ed64f98b04c2e3b4282ef0e3d503d9b2e72d9464a7033ab9 not found: ID does not exist" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 
05:09:51.362893 4775 scope.go:117] "RemoveContainer" containerID="8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.363313 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db\": container with ID starting with 8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db not found: ID does not exist" containerID="8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.363364 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db"} err="failed to get container status \"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db\": rpc error: code = NotFound desc = could not find container \"8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db\": container with ID starting with 8f1be7b1e4c2dbdc319160849ad419dc210086da61fe3aad3c1277cd3f4787db not found: ID does not exist" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.363400 4775 scope.go:117] "RemoveContainer" containerID="e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.363780 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d\": container with ID starting with e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d not found: ID does not exist" containerID="e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.363813 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d"} err="failed to get container status \"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d\": rpc error: code = NotFound desc = could not find container \"e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d\": container with ID starting with e1897e25daedbe1665e342773e7750ff60df7bc2b0e9065d7163a3b3ead5e27d not found: ID does not exist" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.363832 4775 scope.go:117] "RemoveContainer" containerID="fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb" Mar 21 05:09:51 crc kubenswrapper[4775]: E0321 05:09:51.364076 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb\": container with ID starting with fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb not found: ID does not exist" containerID="fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.364205 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb"} err="failed to get container status \"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb\": rpc error: code = NotFound desc = could not find container \"fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb\": container with ID starting with fbf261dc152da2105525608f99cd085cb605f4c4def074f0140516ec04df21eb not found: ID does not exist" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.412906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd\") pod \"ceilometer-0\" (UID: 
\"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.412961 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9sz\" (UniqueName: \"kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.413005 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.413032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.413071 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.413144 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: 
I0321 05:09:51.413174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.514730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9sz\" (UniqueName: \"kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515177 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515214 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515293 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.515902 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.516478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.520740 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.521228 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.521737 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.526186 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.533275 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9sz\" (UniqueName: \"kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz\") pod \"ceilometer-0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.628857 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.675265 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" path="/var/lib/kubelet/pods/e3dd1c1b-7aba-4083-8a9c-037cd38b2d53/volumes" Mar 21 05:09:51 crc kubenswrapper[4775]: I0321 05:09:51.724414 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:52 crc kubenswrapper[4775]: I0321 05:09:52.076236 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:09:52 crc kubenswrapper[4775]: W0321 05:09:52.078476 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e2c9d32_ac8d_41ed_ba32_d88dcaffe4b0.slice/crio-768b8ecf031f12d20a791f4f289282980e217faa8048c9e8011e33181c86d226 WatchSource:0}: Error finding container 768b8ecf031f12d20a791f4f289282980e217faa8048c9e8011e33181c86d226: Status 404 returned error can't find the container with id 768b8ecf031f12d20a791f4f289282980e217faa8048c9e8011e33181c86d226 Mar 21 05:09:52 crc kubenswrapper[4775]: I0321 05:09:52.230352 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerStarted","Data":"768b8ecf031f12d20a791f4f289282980e217faa8048c9e8011e33181c86d226"} Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.240209 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerStarted","Data":"09a1cf40ceee0947e10538878f75aaf051ddd9cfb50d6c16bb5d54ed58dffeea"} Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.799764 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.960464 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs\") pod \"7d4895fa-5b3e-423f-85aa-030d2db076b3\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.960778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle\") pod \"7d4895fa-5b3e-423f-85aa-030d2db076b3\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.960974 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krq52\" (UniqueName: \"kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52\") pod \"7d4895fa-5b3e-423f-85aa-030d2db076b3\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.961089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") pod \"7d4895fa-5b3e-423f-85aa-030d2db076b3\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.961261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs" (OuterVolumeSpecName: "logs") pod "7d4895fa-5b3e-423f-85aa-030d2db076b3" (UID: "7d4895fa-5b3e-423f-85aa-030d2db076b3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.961567 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4895fa-5b3e-423f-85aa-030d2db076b3-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:53 crc kubenswrapper[4775]: I0321 05:09:53.969231 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52" (OuterVolumeSpecName: "kube-api-access-krq52") pod "7d4895fa-5b3e-423f-85aa-030d2db076b3" (UID: "7d4895fa-5b3e-423f-85aa-030d2db076b3"). InnerVolumeSpecName "kube-api-access-krq52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:53 crc kubenswrapper[4775]: E0321 05:09:53.997441 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data podName:7d4895fa-5b3e-423f-85aa-030d2db076b3 nodeName:}" failed. No retries permitted until 2026-03-21 05:09:54.497406806 +0000 UTC m=+1347.473870440 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data") pod "7d4895fa-5b3e-423f-85aa-030d2db076b3" (UID: "7d4895fa-5b3e-423f-85aa-030d2db076b3") : error deleting /var/lib/kubelet/pods/7d4895fa-5b3e-423f-85aa-030d2db076b3/volume-subpaths: remove /var/lib/kubelet/pods/7d4895fa-5b3e-423f-85aa-030d2db076b3/volume-subpaths: no such file or directory Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.001654 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d4895fa-5b3e-423f-85aa-030d2db076b3" (UID: "7d4895fa-5b3e-423f-85aa-030d2db076b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.062902 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krq52\" (UniqueName: \"kubernetes.io/projected/7d4895fa-5b3e-423f-85aa-030d2db076b3-kube-api-access-krq52\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.062935 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.250993 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerStarted","Data":"1d04a9e67e080d04335a90f9088873f5ae0ae56920f7eff7ab1e8e3ecca11bd2"} Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.253351 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerID="ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285" exitCode=0 Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.253378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerDied","Data":"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285"} Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.253395 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d4895fa-5b3e-423f-85aa-030d2db076b3","Type":"ContainerDied","Data":"d95657026d4984394319083a88a788fdbce021127cb6cbf909a8f8735dc34db8"} Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.253411 4775 scope.go:117] "RemoveContainer" containerID="ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.253456 4775 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.273696 4775 scope.go:117] "RemoveContainer" containerID="ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.363015 4775 scope.go:117] "RemoveContainer" containerID="ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285" Mar 21 05:09:54 crc kubenswrapper[4775]: E0321 05:09:54.363389 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285\": container with ID starting with ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285 not found: ID does not exist" containerID="ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.363424 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285"} err="failed to get container status \"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285\": rpc error: code = NotFound desc = could not find container \"ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285\": container with ID starting with ec80243e9f9293a7a7355096362bcfe00d0fa0d7ef991f719431c39710253285 not found: ID does not exist" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.363445 4775 scope.go:117] "RemoveContainer" containerID="ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448" Mar 21 05:09:54 crc kubenswrapper[4775]: E0321 05:09:54.363778 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448\": container with ID starting with 
ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448 not found: ID does not exist" containerID="ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.363836 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448"} err="failed to get container status \"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448\": rpc error: code = NotFound desc = could not find container \"ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448\": container with ID starting with ad57dd190fc32b9e6c21bd060d276edc5365c62c5bf0f9ffb91fab933879f448 not found: ID does not exist" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.573414 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") pod \"7d4895fa-5b3e-423f-85aa-030d2db076b3\" (UID: \"7d4895fa-5b3e-423f-85aa-030d2db076b3\") " Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.578270 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data" (OuterVolumeSpecName: "config-data") pod "7d4895fa-5b3e-423f-85aa-030d2db076b3" (UID: "7d4895fa-5b3e-423f-85aa-030d2db076b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.676199 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4895fa-5b3e-423f-85aa-030d2db076b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.860837 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.890431 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.898734 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.911836 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.914806 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:54 crc kubenswrapper[4775]: E0321 05:09:54.915260 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-log" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.915282 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-log" Mar 21 05:09:54 crc kubenswrapper[4775]: E0321 05:09:54.915310 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-api" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.915318 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-api" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.915495 4775 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-api" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.915516 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" containerName="nova-api-log" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.916478 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.923617 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.925530 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.925568 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:09:54 crc kubenswrapper[4775]: I0321 05:09:54.931654 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.082740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jdpm\" (UniqueName: \"kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.082839 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.082866 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.082886 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.082917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.083310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185174 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185237 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " 
pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jdpm\" (UniqueName: \"kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.185756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.189531 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.191040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.191058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.197584 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.203232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jdpm\" (UniqueName: \"kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm\") pod \"nova-api-0\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.232564 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.268138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerStarted","Data":"ce4a1712fe9b7303080889248fbdbd025014f38760cd0cb19ad06dd199164a50"} Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.290375 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.485221 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-s5ct6"] Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.486976 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.489287 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.490596 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.497158 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s5ct6"] Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.593235 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjnj\" (UniqueName: \"kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.593433 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.593508 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.593602 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.674956 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4895fa-5b3e-423f-85aa-030d2db076b3" path="/var/lib/kubelet/pods/7d4895fa-5b3e-423f-85aa-030d2db076b3/volumes" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.695447 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjnj\" (UniqueName: \"kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.695585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: 
\"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.695625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.695668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.701372 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.703676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.704029 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 
21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.713825 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.725090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjnj\" (UniqueName: \"kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj\") pod \"nova-cell1-cell-mapping-s5ct6\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:55 crc kubenswrapper[4775]: I0321 05:09:55.817815 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.275559 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s5ct6"] Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.292551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerStarted","Data":"4fa32c27636fc153967f4aaaf1df73d8fca4656fe14bc2f63c7966be0877d999"} Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.292604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerStarted","Data":"5417129905c56019139e8fe84dd07140ab4c584dde31a376f01ba295d9052432"} Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.292627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerStarted","Data":"58500c9c4082d59c6b4aa8467cacee4491bef38fcd23aebf0ec570b68876648c"} Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.297968 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerStarted","Data":"c5a4cae04fe0bc9abff39e7ed6ec2b33edafb6e8a7275a05a5ea9574dcb70f60"} Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.298134 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-central-agent" containerID="cri-o://09a1cf40ceee0947e10538878f75aaf051ddd9cfb50d6c16bb5d54ed58dffeea" gracePeriod=30 Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.298334 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.298393 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="proxy-httpd" containerID="cri-o://c5a4cae04fe0bc9abff39e7ed6ec2b33edafb6e8a7275a05a5ea9574dcb70f60" gracePeriod=30 Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.298456 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="sg-core" containerID="cri-o://ce4a1712fe9b7303080889248fbdbd025014f38760cd0cb19ad06dd199164a50" gracePeriod=30 Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.298503 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-notification-agent" containerID="cri-o://1d04a9e67e080d04335a90f9088873f5ae0ae56920f7eff7ab1e8e3ecca11bd2" gracePeriod=30 Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.311866 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s5ct6" 
event={"ID":"f3172d76-da59-4dee-95df-6b74b7fb0033","Type":"ContainerStarted","Data":"d597814d14b2881f772c64b63db812d2cc5083310896232b321ea29b19b85684"} Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.315639 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.315617631 podStartE2EDuration="2.315617631s" podCreationTimestamp="2026-03-21 05:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:56.312289017 +0000 UTC m=+1349.288752651" watchObservedRunningTime="2026-03-21 05:09:56.315617631 +0000 UTC m=+1349.292081255" Mar 21 05:09:56 crc kubenswrapper[4775]: I0321 05:09:56.335615 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.460152588 podStartE2EDuration="5.335591336s" podCreationTimestamp="2026-03-21 05:09:51 +0000 UTC" firstStartedPulling="2026-03-21 05:09:52.080920327 +0000 UTC m=+1345.057383941" lastFinishedPulling="2026-03-21 05:09:55.956359065 +0000 UTC m=+1348.932822689" observedRunningTime="2026-03-21 05:09:56.329527024 +0000 UTC m=+1349.305990648" watchObservedRunningTime="2026-03-21 05:09:56.335591336 +0000 UTC m=+1349.312054970" Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.339681 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s5ct6" event={"ID":"f3172d76-da59-4dee-95df-6b74b7fb0033","Type":"ContainerStarted","Data":"244997576ccb04a6089eebd186997e7f409b2534428a82f578796006c7c674ed"} Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.345312 4775 generic.go:334] "Generic (PLEG): container finished" podID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerID="c5a4cae04fe0bc9abff39e7ed6ec2b33edafb6e8a7275a05a5ea9574dcb70f60" exitCode=0 Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.345341 4775 generic.go:334] "Generic (PLEG): container 
finished" podID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerID="ce4a1712fe9b7303080889248fbdbd025014f38760cd0cb19ad06dd199164a50" exitCode=2 Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.345350 4775 generic.go:334] "Generic (PLEG): container finished" podID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerID="1d04a9e67e080d04335a90f9088873f5ae0ae56920f7eff7ab1e8e3ecca11bd2" exitCode=0 Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.346111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerDied","Data":"c5a4cae04fe0bc9abff39e7ed6ec2b33edafb6e8a7275a05a5ea9574dcb70f60"} Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.346160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerDied","Data":"ce4a1712fe9b7303080889248fbdbd025014f38760cd0cb19ad06dd199164a50"} Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.346173 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerDied","Data":"1d04a9e67e080d04335a90f9088873f5ae0ae56920f7eff7ab1e8e3ecca11bd2"} Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.368784 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-s5ct6" podStartSLOduration=2.368747243 podStartE2EDuration="2.368747243s" podCreationTimestamp="2026-03-21 05:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:09:57.360533311 +0000 UTC m=+1350.336996935" watchObservedRunningTime="2026-03-21 05:09:57.368747243 +0000 UTC m=+1350.345210867" Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.742280 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.809470 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"] Mar 21 05:09:57 crc kubenswrapper[4775]: I0321 05:09:57.809754 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="dnsmasq-dns" containerID="cri-o://210c52f72b30e6d06b1636b7f4ba7c385a206b219bca52c176d9da81b984a93a" gracePeriod=10 Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.356870 4775 generic.go:334] "Generic (PLEG): container finished" podID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerID="210c52f72b30e6d06b1636b7f4ba7c385a206b219bca52c176d9da81b984a93a" exitCode=0 Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.357582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" event={"ID":"20a97ec2-24be-494f-a3e8-dc7d202021c3","Type":"ContainerDied","Data":"210c52f72b30e6d06b1636b7f4ba7c385a206b219bca52c176d9da81b984a93a"} Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.357610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" event={"ID":"20a97ec2-24be-494f-a3e8-dc7d202021c3","Type":"ContainerDied","Data":"9a2e5054b120d9d9b114985bfe9a3a6397285644c20c24699a021ae58fcb7c61"} Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.357620 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a2e5054b120d9d9b114985bfe9a3a6397285644c20c24699a021ae58fcb7c61" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.410503 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455580 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptvg9\" (UniqueName: \"kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455690 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.455727 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb\") pod \"20a97ec2-24be-494f-a3e8-dc7d202021c3\" (UID: \"20a97ec2-24be-494f-a3e8-dc7d202021c3\") " Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.472347 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9" (OuterVolumeSpecName: "kube-api-access-ptvg9") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "kube-api-access-ptvg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.529643 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.531774 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.536050 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.557863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.559535 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.559567 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.559581 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptvg9\" (UniqueName: \"kubernetes.io/projected/20a97ec2-24be-494f-a3e8-dc7d202021c3-kube-api-access-ptvg9\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.559590 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.559601 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.570764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config" (OuterVolumeSpecName: "config") pod "20a97ec2-24be-494f-a3e8-dc7d202021c3" (UID: "20a97ec2-24be-494f-a3e8-dc7d202021c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:09:58 crc kubenswrapper[4775]: I0321 05:09:58.662391 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20a97ec2-24be-494f-a3e8-dc7d202021c3-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.371530 4775 generic.go:334] "Generic (PLEG): container finished" podID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerID="09a1cf40ceee0947e10538878f75aaf051ddd9cfb50d6c16bb5d54ed58dffeea" exitCode=0 Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.371631 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerDied","Data":"09a1cf40ceee0947e10538878f75aaf051ddd9cfb50d6c16bb5d54ed58dffeea"} Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.371945 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.430744 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"] Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.476089 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-qsbbk"] Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.673347 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" path="/var/lib/kubelet/pods/20a97ec2-24be-494f-a3e8-dc7d202021c3/volumes" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.731594 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.784823 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.784989 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785058 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d9sz\" (UniqueName: \"kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785137 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785324 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts\") pod \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\" (UID: \"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0\") " Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.785738 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.786029 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.786053 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.794279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts" (OuterVolumeSpecName: "scripts") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.794447 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz" (OuterVolumeSpecName: "kube-api-access-2d9sz") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "kube-api-access-2d9sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.813844 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.880998 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.888322 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d9sz\" (UniqueName: \"kubernetes.io/projected/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-kube-api-access-2d9sz\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.888367 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.888377 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.888385 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.899418 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data" (OuterVolumeSpecName: "config-data") pod "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" (UID: "3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:09:59 crc kubenswrapper[4775]: I0321 05:09:59.989866 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.143927 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567830-jv78x"] Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144383 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="sg-core" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144407 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="sg-core" Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144430 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="dnsmasq-dns" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144438 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="dnsmasq-dns" Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144461 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-central-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144469 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-central-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144488 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-notification-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144496 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-notification-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144510 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="init" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144517 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="init" Mar 21 05:10:00 crc kubenswrapper[4775]: E0321 05:10:00.144532 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="proxy-httpd" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144540 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="proxy-httpd" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144735 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="proxy-httpd" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144759 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-central-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144774 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="dnsmasq-dns" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144785 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="sg-core" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.144803 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" containerName="ceilometer-notification-agent" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.145528 4775 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.149657 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.149689 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.149917 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.155547 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-jv78x"] Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.193452 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns84k\" (UniqueName: \"kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k\") pod \"auto-csr-approver-29567830-jv78x\" (UID: \"33cb4f91-aca0-4221-873f-e2c16e30ccee\") " pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.295199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns84k\" (UniqueName: \"kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k\") pod \"auto-csr-approver-29567830-jv78x\" (UID: \"33cb4f91-aca0-4221-873f-e2c16e30ccee\") " pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.314817 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns84k\" (UniqueName: \"kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k\") pod \"auto-csr-approver-29567830-jv78x\" (UID: \"33cb4f91-aca0-4221-873f-e2c16e30ccee\") " 
pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.395858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0","Type":"ContainerDied","Data":"768b8ecf031f12d20a791f4f289282980e217faa8048c9e8011e33181c86d226"} Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.396973 4775 scope.go:117] "RemoveContainer" containerID="c5a4cae04fe0bc9abff39e7ed6ec2b33edafb6e8a7275a05a5ea9574dcb70f60" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.395921 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.448330 4775 scope.go:117] "RemoveContainer" containerID="ce4a1712fe9b7303080889248fbdbd025014f38760cd0cb19ad06dd199164a50" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.454524 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.463536 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.465268 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.479919 4775 scope.go:117] "RemoveContainer" containerID="1d04a9e67e080d04335a90f9088873f5ae0ae56920f7eff7ab1e8e3ecca11bd2" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.482430 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.485353 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.489100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.489715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.499827 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.528859 4775 scope.go:117] "RemoveContainer" containerID="09a1cf40ceee0947e10538878f75aaf051ddd9cfb50d6c16bb5d54ed58dffeea" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4fk\" (UniqueName: \"kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600145 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600269 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.600436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.701949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702456 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " 
pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4fk\" (UniqueName: \"kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.702757 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.704239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.704335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.706829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.720110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.720783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.721829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.725544 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lk4fk\" (UniqueName: \"kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk\") pod \"ceilometer-0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.849410 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:00 crc kubenswrapper[4775]: I0321 05:10:00.969424 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-jv78x"] Mar 21 05:10:00 crc kubenswrapper[4775]: W0321 05:10:00.983386 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33cb4f91_aca0_4221_873f_e2c16e30ccee.slice/crio-e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3 WatchSource:0}: Error finding container e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3: Status 404 returned error can't find the container with id e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3 Mar 21 05:10:01 crc kubenswrapper[4775]: I0321 05:10:01.324673 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:01 crc kubenswrapper[4775]: I0321 05:10:01.407646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-jv78x" event={"ID":"33cb4f91-aca0-4221-873f-e2c16e30ccee","Type":"ContainerStarted","Data":"e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3"} Mar 21 05:10:01 crc kubenswrapper[4775]: I0321 05:10:01.411349 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerStarted","Data":"0199a14e4f5a12a8019f727e78e7f15430a4ee84de969cda88096ca15bbed15b"} Mar 21 05:10:01 crc kubenswrapper[4775]: I0321 05:10:01.673489 4775 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0" path="/var/lib/kubelet/pods/3e2c9d32-ac8d-41ed-ba32-d88dcaffe4b0/volumes" Mar 21 05:10:02 crc kubenswrapper[4775]: I0321 05:10:02.422189 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3172d76-da59-4dee-95df-6b74b7fb0033" containerID="244997576ccb04a6089eebd186997e7f409b2534428a82f578796006c7c674ed" exitCode=0 Mar 21 05:10:02 crc kubenswrapper[4775]: I0321 05:10:02.422305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s5ct6" event={"ID":"f3172d76-da59-4dee-95df-6b74b7fb0033","Type":"ContainerDied","Data":"244997576ccb04a6089eebd186997e7f409b2534428a82f578796006c7c674ed"} Mar 21 05:10:02 crc kubenswrapper[4775]: I0321 05:10:02.482333 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:10:02 crc kubenswrapper[4775]: I0321 05:10:02.482412 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.106752 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bccf8f775-qsbbk" podUID="20a97ec2-24be-494f-a3e8-dc7d202021c3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: i/o timeout" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.432745 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerStarted","Data":"8f8cabcb03903e4d89761a18703dd5c97870cd2e3d0ef375a4910ff30a8d99cf"} Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.794422 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.862720 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts\") pod \"f3172d76-da59-4dee-95df-6b74b7fb0033\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.862813 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data\") pod \"f3172d76-da59-4dee-95df-6b74b7fb0033\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.862921 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnjnj\" (UniqueName: \"kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj\") pod \"f3172d76-da59-4dee-95df-6b74b7fb0033\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.862978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle\") pod \"f3172d76-da59-4dee-95df-6b74b7fb0033\" (UID: \"f3172d76-da59-4dee-95df-6b74b7fb0033\") " Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.868471 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts" (OuterVolumeSpecName: "scripts") pod 
"f3172d76-da59-4dee-95df-6b74b7fb0033" (UID: "f3172d76-da59-4dee-95df-6b74b7fb0033"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.868945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj" (OuterVolumeSpecName: "kube-api-access-fnjnj") pod "f3172d76-da59-4dee-95df-6b74b7fb0033" (UID: "f3172d76-da59-4dee-95df-6b74b7fb0033"). InnerVolumeSpecName "kube-api-access-fnjnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.894238 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3172d76-da59-4dee-95df-6b74b7fb0033" (UID: "f3172d76-da59-4dee-95df-6b74b7fb0033"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.894608 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data" (OuterVolumeSpecName: "config-data") pod "f3172d76-da59-4dee-95df-6b74b7fb0033" (UID: "f3172d76-da59-4dee-95df-6b74b7fb0033"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.964732 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.964761 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.964770 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3172d76-da59-4dee-95df-6b74b7fb0033-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:03 crc kubenswrapper[4775]: I0321 05:10:03.964778 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnjnj\" (UniqueName: \"kubernetes.io/projected/f3172d76-da59-4dee-95df-6b74b7fb0033-kube-api-access-fnjnj\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.444997 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s5ct6" event={"ID":"f3172d76-da59-4dee-95df-6b74b7fb0033","Type":"ContainerDied","Data":"d597814d14b2881f772c64b63db812d2cc5083310896232b321ea29b19b85684"} Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.445365 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d597814d14b2881f772c64b63db812d2cc5083310896232b321ea29b19b85684" Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.445438 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s5ct6" Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.632573 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.632804 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerName="nova-scheduler-scheduler" containerID="cri-o://c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" gracePeriod=30 Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.650762 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.651079 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-log" containerID="cri-o://5417129905c56019139e8fe84dd07140ab4c584dde31a376f01ba295d9052432" gracePeriod=30 Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.651261 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-api" containerID="cri-o://4fa32c27636fc153967f4aaaf1df73d8fca4656fe14bc2f63c7966be0877d999" gracePeriod=30 Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.666606 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.666834 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-log" containerID="cri-o://9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc" gracePeriod=30 Mar 21 05:10:04 crc kubenswrapper[4775]: I0321 05:10:04.666964 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-metadata" containerID="cri-o://88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc" gracePeriod=30 Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.459584 4775 generic.go:334] "Generic (PLEG): container finished" podID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerID="4fa32c27636fc153967f4aaaf1df73d8fca4656fe14bc2f63c7966be0877d999" exitCode=0 Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.459906 4775 generic.go:334] "Generic (PLEG): container finished" podID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerID="5417129905c56019139e8fe84dd07140ab4c584dde31a376f01ba295d9052432" exitCode=143 Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.459801 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerDied","Data":"4fa32c27636fc153967f4aaaf1df73d8fca4656fe14bc2f63c7966be0877d999"} Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.459985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerDied","Data":"5417129905c56019139e8fe84dd07140ab4c584dde31a376f01ba295d9052432"} Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.462189 4775 generic.go:334] "Generic (PLEG): container finished" podID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerID="9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc" exitCode=143 Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.462245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerDied","Data":"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc"} Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 
05:10:05.464222 4775 generic.go:334] "Generic (PLEG): container finished" podID="33cb4f91-aca0-4221-873f-e2c16e30ccee" containerID="96a2bef185388797d3fea57d8ac4fefdb91c53837dbce4eb9abcae899aab9b04" exitCode=0 Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.464272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-jv78x" event={"ID":"33cb4f91-aca0-4221-873f-e2c16e30ccee","Type":"ContainerDied","Data":"96a2bef185388797d3fea57d8ac4fefdb91c53837dbce4eb9abcae899aab9b04"} Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.861911 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.933395 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.933458 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.933482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jdpm\" (UniqueName: \"kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.933639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.933871 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs" (OuterVolumeSpecName: "logs") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.934175 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.934399 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data\") pod \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\" (UID: \"fb3f1e05-6056-4a7f-9452-22caa26e74fd\") " Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.936025 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3f1e05-6056-4a7f-9452-22caa26e74fd-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.938730 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm" (OuterVolumeSpecName: "kube-api-access-8jdpm") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "kube-api-access-8jdpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.962064 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.971290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data" (OuterVolumeSpecName: "config-data") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.983982 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4775]: I0321 05:10:05.991492 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb3f1e05-6056-4a7f-9452-22caa26e74fd" (UID: "fb3f1e05-6056-4a7f-9452-22caa26e74fd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.037619 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.037924 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.037936 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jdpm\" (UniqueName: \"kubernetes.io/projected/fb3f1e05-6056-4a7f-9452-22caa26e74fd-kube-api-access-8jdpm\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.037945 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.037953 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3f1e05-6056-4a7f-9452-22caa26e74fd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.374264 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2 is running failed: container process not found" containerID="c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.374700 4775 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2 is running failed: container process not found" containerID="c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.375047 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2 is running failed: container process not found" containerID="c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.375086 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerName="nova-scheduler-scheduler" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.477434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3f1e05-6056-4a7f-9452-22caa26e74fd","Type":"ContainerDied","Data":"58500c9c4082d59c6b4aa8467cacee4491bef38fcd23aebf0ec570b68876648c"} Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.477493 4775 scope.go:117] "RemoveContainer" containerID="4fa32c27636fc153967f4aaaf1df73d8fca4656fe14bc2f63c7966be0877d999" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.477643 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.492369 4775 generic.go:334] "Generic (PLEG): container finished" podID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerID="c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" exitCode=0 Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.492479 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f713501-fa26-4ef2-87ac-c5376b84891d","Type":"ContainerDied","Data":"c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2"} Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.498295 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerStarted","Data":"5c6a6597ad7154fd7a253fa86eb6c5cde8565f20b6949e97002d664b75764604"} Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.525473 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.544103 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556095 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.556619 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-log" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556638 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-log" Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.556660 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3172d76-da59-4dee-95df-6b74b7fb0033" containerName="nova-manage" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556669 
4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3172d76-da59-4dee-95df-6b74b7fb0033" containerName="nova-manage" Mar 21 05:10:06 crc kubenswrapper[4775]: E0321 05:10:06.556691 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-api" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556700 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-api" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556937 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3172d76-da59-4dee-95df-6b74b7fb0033" containerName="nova-manage" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556954 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-api" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.556975 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" containerName="nova-api-log" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.558262 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.560634 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.561722 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.563702 4775 scope.go:117] "RemoveContainer" containerID="5417129905c56019139e8fe84dd07140ab4c584dde31a376f01ba295d9052432" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.573226 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.576492 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653494 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8md\" (UniqueName: \"kubernetes.io/projected/886c404c-ceec-48e7-90da-96d6aa201152-kube-api-access-bv8md\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-internal-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " 
pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-config-data\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886c404c-ceec-48e7-90da-96d6aa201152-logs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.653676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-public-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.691262 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754298 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data\") pod \"0f713501-fa26-4ef2-87ac-c5376b84891d\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle\") pod \"0f713501-fa26-4ef2-87ac-c5376b84891d\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754597 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jkv8\" (UniqueName: \"kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8\") pod \"0f713501-fa26-4ef2-87ac-c5376b84891d\" (UID: \"0f713501-fa26-4ef2-87ac-c5376b84891d\") " Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-internal-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754840 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-config-data\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/886c404c-ceec-48e7-90da-96d6aa201152-logs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754919 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-public-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.754970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.755092 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8md\" (UniqueName: \"kubernetes.io/projected/886c404c-ceec-48e7-90da-96d6aa201152-kube-api-access-bv8md\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.760614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/886c404c-ceec-48e7-90da-96d6aa201152-logs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.766000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.767347 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8" (OuterVolumeSpecName: "kube-api-access-5jkv8") pod "0f713501-fa26-4ef2-87ac-c5376b84891d" (UID: "0f713501-fa26-4ef2-87ac-c5376b84891d"). InnerVolumeSpecName "kube-api-access-5jkv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.769082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-config-data\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.770401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-public-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.776131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/886c404c-ceec-48e7-90da-96d6aa201152-internal-tls-certs\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.796245 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8md\" (UniqueName: \"kubernetes.io/projected/886c404c-ceec-48e7-90da-96d6aa201152-kube-api-access-bv8md\") pod \"nova-api-0\" (UID: \"886c404c-ceec-48e7-90da-96d6aa201152\") " pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.808711 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data" 
(OuterVolumeSpecName: "config-data") pod "0f713501-fa26-4ef2-87ac-c5376b84891d" (UID: "0f713501-fa26-4ef2-87ac-c5376b84891d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.838814 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f713501-fa26-4ef2-87ac-c5376b84891d" (UID: "0f713501-fa26-4ef2-87ac-c5376b84891d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.856327 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jkv8\" (UniqueName: \"kubernetes.io/projected/0f713501-fa26-4ef2-87ac-c5376b84891d-kube-api-access-5jkv8\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.856356 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.856365 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f713501-fa26-4ef2-87ac-c5376b84891d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.893931 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.905036 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.957355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns84k\" (UniqueName: \"kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k\") pod \"33cb4f91-aca0-4221-873f-e2c16e30ccee\" (UID: \"33cb4f91-aca0-4221-873f-e2c16e30ccee\") " Mar 21 05:10:06 crc kubenswrapper[4775]: I0321 05:10:06.961142 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k" (OuterVolumeSpecName: "kube-api-access-ns84k") pod "33cb4f91-aca0-4221-873f-e2c16e30ccee" (UID: "33cb4f91-aca0-4221-873f-e2c16e30ccee"). InnerVolumeSpecName "kube-api-access-ns84k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.059668 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns84k\" (UniqueName: \"kubernetes.io/projected/33cb4f91-aca0-4221-873f-e2c16e30ccee-kube-api-access-ns84k\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.491050 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.510813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0f713501-fa26-4ef2-87ac-c5376b84891d","Type":"ContainerDied","Data":"35d35b17b5e3ddd8509d6d47f432a3246971c8785668db26e3bf835f5d64e720"} Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.510867 4775 scope.go:117] "RemoveContainer" containerID="c561875841331ee73e6bcb2e365ce80407bc0539d3ec5ad570942023fedeb9c2" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.510978 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.516715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerStarted","Data":"02056fc0b14ba46d658b8a5aac716fe485d874dd4d3feb41a0e56ea871cdd2a0"} Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.526526 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-jv78x" event={"ID":"33cb4f91-aca0-4221-873f-e2c16e30ccee","Type":"ContainerDied","Data":"e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3"} Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.526571 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e13af9720ea7d69a98c04ade641d3957d885af03c65baffd94332be373fb4ff3" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.526625 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-jv78x" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.562170 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.585238 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.598664 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:07 crc kubenswrapper[4775]: E0321 05:10:07.599154 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33cb4f91-aca0-4221-873f-e2c16e30ccee" containerName="oc" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.599170 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cb4f91-aca0-4221-873f-e2c16e30ccee" containerName="oc" Mar 21 05:10:07 crc kubenswrapper[4775]: E0321 05:10:07.599211 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerName="nova-scheduler-scheduler" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.599229 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerName="nova-scheduler-scheduler" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.599527 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="33cb4f91-aca0-4221-873f-e2c16e30ccee" containerName="oc" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.599555 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" containerName="nova-scheduler-scheduler" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.600276 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.603175 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.607399 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.672494 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.672571 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqwh\" (UniqueName: \"kubernetes.io/projected/bab55731-40da-4831-a8b5-f9c413452367-kube-api-access-lvqwh\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.672662 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-config-data\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.688902 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f713501-fa26-4ef2-87ac-c5376b84891d" path="/var/lib/kubelet/pods/0f713501-fa26-4ef2-87ac-c5376b84891d/volumes" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.689681 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3f1e05-6056-4a7f-9452-22caa26e74fd" 
path="/var/lib/kubelet/pods/fb3f1e05-6056-4a7f-9452-22caa26e74fd/volumes" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.774869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.775214 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqwh\" (UniqueName: \"kubernetes.io/projected/bab55731-40da-4831-a8b5-f9c413452367-kube-api-access-lvqwh\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.775244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-config-data\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.780776 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.783021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bab55731-40da-4831-a8b5-f9c413452367-config-data\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:07 crc kubenswrapper[4775]: I0321 05:10:07.799633 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lvqwh\" (UniqueName: \"kubernetes.io/projected/bab55731-40da-4831-a8b5-f9c413452367-kube-api-access-lvqwh\") pod \"nova-scheduler-0\" (UID: \"bab55731-40da-4831-a8b5-f9c413452367\") " pod="openstack/nova-scheduler-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.007005 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-jsc7j"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.014252 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-jsc7j"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.051177 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.239471 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.284311 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle\") pod \"636228b7-669b-4b5d-abec-bf78cb1513f0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.284425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs\") pod \"636228b7-669b-4b5d-abec-bf78cb1513f0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.284581 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data\") pod \"636228b7-669b-4b5d-abec-bf78cb1513f0\" (UID: 
\"636228b7-669b-4b5d-abec-bf78cb1513f0\") " Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.284649 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs\") pod \"636228b7-669b-4b5d-abec-bf78cb1513f0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.284738 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plq2h\" (UniqueName: \"kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h\") pod \"636228b7-669b-4b5d-abec-bf78cb1513f0\" (UID: \"636228b7-669b-4b5d-abec-bf78cb1513f0\") " Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.286441 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs" (OuterVolumeSpecName: "logs") pod "636228b7-669b-4b5d-abec-bf78cb1513f0" (UID: "636228b7-669b-4b5d-abec-bf78cb1513f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.292385 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h" (OuterVolumeSpecName: "kube-api-access-plq2h") pod "636228b7-669b-4b5d-abec-bf78cb1513f0" (UID: "636228b7-669b-4b5d-abec-bf78cb1513f0"). InnerVolumeSpecName "kube-api-access-plq2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.336261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data" (OuterVolumeSpecName: "config-data") pod "636228b7-669b-4b5d-abec-bf78cb1513f0" (UID: "636228b7-669b-4b5d-abec-bf78cb1513f0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.368426 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "636228b7-669b-4b5d-abec-bf78cb1513f0" (UID: "636228b7-669b-4b5d-abec-bf78cb1513f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.387997 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "636228b7-669b-4b5d-abec-bf78cb1513f0" (UID: "636228b7-669b-4b5d-abec-bf78cb1513f0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.390491 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.390515 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plq2h\" (UniqueName: \"kubernetes.io/projected/636228b7-669b-4b5d-abec-bf78cb1513f0-kube-api-access-plq2h\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.390524 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.390533 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/636228b7-669b-4b5d-abec-bf78cb1513f0-logs\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.390542 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/636228b7-669b-4b5d-abec-bf78cb1513f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.531746 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.543049 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"886c404c-ceec-48e7-90da-96d6aa201152","Type":"ContainerStarted","Data":"e15bb364282c0ba2931783f4dafe88fe9283770f5c4f7fa427724353592f1c45"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.543101 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"886c404c-ceec-48e7-90da-96d6aa201152","Type":"ContainerStarted","Data":"01cbfe40ccd945f67c2bba82115e395d0b9610b14660c86f3c227e8c213758b1"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.543130 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"886c404c-ceec-48e7-90da-96d6aa201152","Type":"ContainerStarted","Data":"2d7a24379beecfe6c23a41760c2ac92eb37fa2a9f779e013f2c6319dd30ffc87"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.550056 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab55731-40da-4831-a8b5-f9c413452367","Type":"ContainerStarted","Data":"54239add11ebd7fbe13885f3881605227b4aa3474202a89dd331c69292908f27"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.552184 4775 generic.go:334] "Generic (PLEG): container finished" podID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerID="88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc" exitCode=0 Mar 21 05:10:08 crc 
kubenswrapper[4775]: I0321 05:10:08.552239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerDied","Data":"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.552257 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"636228b7-669b-4b5d-abec-bf78cb1513f0","Type":"ContainerDied","Data":"c6bcff38ed2ee79c0028a43362b2200b49c362342fd98956493476b35ddff899"} Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.552273 4775 scope.go:117] "RemoveContainer" containerID="88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.552547 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.588653 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.588614888 podStartE2EDuration="2.588614888s" podCreationTimestamp="2026-03-21 05:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:10:08.568620503 +0000 UTC m=+1361.545084127" watchObservedRunningTime="2026-03-21 05:10:08.588614888 +0000 UTC m=+1361.565078512" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.707514 4775 scope.go:117] "RemoveContainer" containerID="9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.748826 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.749896 4775 scope.go:117] "RemoveContainer" containerID="88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc" Mar 
21 05:10:08 crc kubenswrapper[4775]: E0321 05:10:08.752524 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc\": container with ID starting with 88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc not found: ID does not exist" containerID="88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.752568 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc"} err="failed to get container status \"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc\": rpc error: code = NotFound desc = could not find container \"88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc\": container with ID starting with 88b942c0d478eda3ab139d99845879b76d209947d510208b64a88f8f909afefc not found: ID does not exist" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.752595 4775 scope.go:117] "RemoveContainer" containerID="9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc" Mar 21 05:10:08 crc kubenswrapper[4775]: E0321 05:10:08.757226 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc\": container with ID starting with 9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc not found: ID does not exist" containerID="9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.757267 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc"} err="failed to get container status 
\"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc\": rpc error: code = NotFound desc = could not find container \"9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc\": container with ID starting with 9a436c521b9cde86255d5d1bb86e7e7617657d67544c27a71666b10506ae93dc not found: ID does not exist" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.762184 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.770051 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:08 crc kubenswrapper[4775]: E0321 05:10:08.770451 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-metadata" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.770467 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-metadata" Mar 21 05:10:08 crc kubenswrapper[4775]: E0321 05:10:08.770481 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-log" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.770487 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-log" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.770655 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-metadata" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.770676 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" containerName="nova-metadata-log" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.771904 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.775156 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.790307 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.800246 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnlxt\" (UniqueName: \"kubernetes.io/projected/208cfa71-8242-4958-b9db-21fc180a6697-kube-api-access-tnlxt\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.800342 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.800388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208cfa71-8242-4958-b9db-21fc180a6697-logs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.800430 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-config-data\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.800456 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.804863 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnlxt\" (UniqueName: \"kubernetes.io/projected/208cfa71-8242-4958-b9db-21fc180a6697-kube-api-access-tnlxt\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902337 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902390 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208cfa71-8242-4958-b9db-21fc180a6697-logs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902433 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-config-data\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902463 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.902904 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/208cfa71-8242-4958-b9db-21fc180a6697-logs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.907491 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.907525 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-config-data\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.909601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208cfa71-8242-4958-b9db-21fc180a6697-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:08 crc kubenswrapper[4775]: I0321 05:10:08.923506 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnlxt\" (UniqueName: \"kubernetes.io/projected/208cfa71-8242-4958-b9db-21fc180a6697-kube-api-access-tnlxt\") pod \"nova-metadata-0\" 
(UID: \"208cfa71-8242-4958-b9db-21fc180a6697\") " pod="openstack/nova-metadata-0" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.116649 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.560858 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.578311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208cfa71-8242-4958-b9db-21fc180a6697","Type":"ContainerStarted","Data":"bef23ce3537e304a8440703679a26f74fdb34f50019dcbd73a0a31c53033d754"} Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.581382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bab55731-40da-4831-a8b5-f9c413452367","Type":"ContainerStarted","Data":"1bf60e4668a1f23910cdef9ea4cd616c9ffe8afc8a12261f33ae430c857beb32"} Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.590178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerStarted","Data":"ef9369d555365416b278955eec54a4b9a28ad97b7c0e8c0f39d301bd6d2b5a92"} Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.590683 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.603279 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.603256831 podStartE2EDuration="2.603256831s" podCreationTimestamp="2026-03-21 05:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:10:09.595892253 +0000 UTC m=+1362.572355887" watchObservedRunningTime="2026-03-21 05:10:09.603256831 
+0000 UTC m=+1362.579720465" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.647154 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.956140758 podStartE2EDuration="9.647132452s" podCreationTimestamp="2026-03-21 05:10:00 +0000 UTC" firstStartedPulling="2026-03-21 05:10:01.330208553 +0000 UTC m=+1354.306672177" lastFinishedPulling="2026-03-21 05:10:09.021200247 +0000 UTC m=+1361.997663871" observedRunningTime="2026-03-21 05:10:09.639390993 +0000 UTC m=+1362.615854617" watchObservedRunningTime="2026-03-21 05:10:09.647132452 +0000 UTC m=+1362.623596076" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.677708 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636228b7-669b-4b5d-abec-bf78cb1513f0" path="/var/lib/kubelet/pods/636228b7-669b-4b5d-abec-bf78cb1513f0/volumes" Mar 21 05:10:09 crc kubenswrapper[4775]: I0321 05:10:09.678261 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c861c77c-15ba-4204-8603-e6093bc1a0b8" path="/var/lib/kubelet/pods/c861c77c-15ba-4204-8603-e6093bc1a0b8/volumes" Mar 21 05:10:10 crc kubenswrapper[4775]: I0321 05:10:10.602850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208cfa71-8242-4958-b9db-21fc180a6697","Type":"ContainerStarted","Data":"710f8b630e8fd4d943202a35b0f5180cd356bcfa3ee0c35b8ed1b0b486c93e15"} Mar 21 05:10:11 crc kubenswrapper[4775]: I0321 05:10:11.613559 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"208cfa71-8242-4958-b9db-21fc180a6697","Type":"ContainerStarted","Data":"f6c428a9e20773a6c9af646402b83b649bee9c253793ddd00227e15c99dfbb48"} Mar 21 05:10:11 crc kubenswrapper[4775]: I0321 05:10:11.636216 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.636178672 podStartE2EDuration="3.636178672s" 
podCreationTimestamp="2026-03-21 05:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:10:11.629594246 +0000 UTC m=+1364.606057890" watchObservedRunningTime="2026-03-21 05:10:11.636178672 +0000 UTC m=+1364.612642316" Mar 21 05:10:13 crc kubenswrapper[4775]: I0321 05:10:13.052030 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 05:10:16 crc kubenswrapper[4775]: I0321 05:10:16.894263 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:10:16 crc kubenswrapper[4775]: I0321 05:10:16.894754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 05:10:17 crc kubenswrapper[4775]: I0321 05:10:17.909364 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="886c404c-ceec-48e7-90da-96d6aa201152" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:10:17 crc kubenswrapper[4775]: I0321 05:10:17.910936 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="886c404c-ceec-48e7-90da-96d6aa201152" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:10:18 crc kubenswrapper[4775]: I0321 05:10:18.052778 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 05:10:18 crc kubenswrapper[4775]: I0321 05:10:18.078222 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 05:10:18 crc kubenswrapper[4775]: I0321 05:10:18.727277 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 05:10:19 crc kubenswrapper[4775]: I0321 05:10:19.117720 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:10:19 crc kubenswrapper[4775]: I0321 05:10:19.117837 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 05:10:20 crc kubenswrapper[4775]: I0321 05:10:20.128514 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="208cfa71-8242-4958-b9db-21fc180a6697" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:10:20 crc kubenswrapper[4775]: I0321 05:10:20.128551 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="208cfa71-8242-4958-b9db-21fc180a6697" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 05:10:20 crc kubenswrapper[4775]: I0321 05:10:20.568521 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e3dd1c1b-7aba-4083-8a9c-037cd38b2d53" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.194:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 05:10:24 crc kubenswrapper[4775]: I0321 05:10:24.894566 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:10:24 crc kubenswrapper[4775]: I0321 05:10:24.895155 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.000513 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.003113 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.005886 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.117620 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.117769 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 05:10:27 crc kubenswrapper[4775]: I0321 05:10:27.804705 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 05:10:29 crc kubenswrapper[4775]: I0321 05:10:29.124876 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:10:29 crc kubenswrapper[4775]: I0321 05:10:29.126454 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 05:10:29 crc kubenswrapper[4775]: I0321 05:10:29.130227 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:10:29 crc kubenswrapper[4775]: I0321 05:10:29.130880 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 05:10:31 crc kubenswrapper[4775]: I0321 05:10:31.192998 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 05:10:32 crc kubenswrapper[4775]: I0321 05:10:32.482907 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 21 05:10:32 crc kubenswrapper[4775]: I0321 05:10:32.483324 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:10:34 crc kubenswrapper[4775]: I0321 05:10:34.870562 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:34 crc kubenswrapper[4775]: I0321 05:10:34.871160 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="31578b15-f84b-4862-ae52-6720dac8f5e2" containerName="kube-state-metrics" containerID="cri-o://ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a" gracePeriod=30 Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.398246 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.496552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6zwf\" (UniqueName: \"kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf\") pod \"31578b15-f84b-4862-ae52-6720dac8f5e2\" (UID: \"31578b15-f84b-4862-ae52-6720dac8f5e2\") " Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.515349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf" (OuterVolumeSpecName: "kube-api-access-l6zwf") pod "31578b15-f84b-4862-ae52-6720dac8f5e2" (UID: "31578b15-f84b-4862-ae52-6720dac8f5e2"). InnerVolumeSpecName "kube-api-access-l6zwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.599491 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6zwf\" (UniqueName: \"kubernetes.io/projected/31578b15-f84b-4862-ae52-6720dac8f5e2-kube-api-access-l6zwf\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.895732 4775 generic.go:334] "Generic (PLEG): container finished" podID="31578b15-f84b-4862-ae52-6720dac8f5e2" containerID="ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a" exitCode=2 Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.895768 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"31578b15-f84b-4862-ae52-6720dac8f5e2","Type":"ContainerDied","Data":"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a"} Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.895794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"31578b15-f84b-4862-ae52-6720dac8f5e2","Type":"ContainerDied","Data":"95eba21f713625c8d0e5f46fb846b1edb0522a4501294e1679ea681e44f67781"} Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.895811 4775 scope.go:117] "RemoveContainer" containerID="ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.895815 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.920300 4775 scope.go:117] "RemoveContainer" containerID="ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a" Mar 21 05:10:35 crc kubenswrapper[4775]: E0321 05:10:35.920681 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a\": container with ID starting with ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a not found: ID does not exist" containerID="ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.920718 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a"} err="failed to get container status \"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a\": rpc error: code = NotFound desc = could not find container \"ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a\": container with ID starting with ebdac31be6ec007bb4ed96d43a4d78aab3ca618e5b3cc6ddb93c2810c9ec401a not found: ID does not exist" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.923137 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.935533 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.949000 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:35 crc kubenswrapper[4775]: E0321 05:10:35.949538 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31578b15-f84b-4862-ae52-6720dac8f5e2" containerName="kube-state-metrics" Mar 21 05:10:35 crc 
kubenswrapper[4775]: I0321 05:10:35.949560 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31578b15-f84b-4862-ae52-6720dac8f5e2" containerName="kube-state-metrics" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.949801 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31578b15-f84b-4862-ae52-6720dac8f5e2" containerName="kube-state-metrics" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.950581 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.952939 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.953289 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 21 05:10:35 crc kubenswrapper[4775]: I0321 05:10:35.982218 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.107864 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.107934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.107966 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.107996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzt24\" (UniqueName: \"kubernetes.io/projected/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-api-access-nzt24\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.210916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.211350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.211419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.211479 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nzt24\" (UniqueName: \"kubernetes.io/projected/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-api-access-nzt24\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.217304 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.220358 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.221208 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.236783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzt24\" (UniqueName: \"kubernetes.io/projected/7ee3b7a0-9eb3-4702-8fb7-3286df60b21b-kube-api-access-nzt24\") pod \"kube-state-metrics-0\" (UID: \"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b\") " pod="openstack/kube-state-metrics-0" Mar 21 05:10:36 crc kubenswrapper[4775]: I0321 05:10:36.265212 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.731450 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.732387 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-central-agent" containerID="cri-o://8f8cabcb03903e4d89761a18703dd5c97870cd2e3d0ef375a4910ff30a8d99cf" gracePeriod=30 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.732834 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="proxy-httpd" containerID="cri-o://ef9369d555365416b278955eec54a4b9a28ad97b7c0e8c0f39d301bd6d2b5a92" gracePeriod=30 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.732883 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="sg-core" containerID="cri-o://02056fc0b14ba46d658b8a5aac716fe485d874dd4d3feb41a0e56ea871cdd2a0" gracePeriod=30 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.732916 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-notification-agent" containerID="cri-o://5c6a6597ad7154fd7a253fa86eb6c5cde8565f20b6949e97002d664b75764604" gracePeriod=30 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.759628 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.919231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b","Type":"ContainerStarted","Data":"e6fcb40947e0a44da02247b3001dd7bbad982138ad811a2a9dff2ffa3dc5b034"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.922044 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fdf78a6-b107-4949-affa-6152d15afda0" containerID="02056fc0b14ba46d658b8a5aac716fe485d874dd4d3feb41a0e56ea871cdd2a0" exitCode=2 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:36.922146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerDied","Data":"02056fc0b14ba46d658b8a5aac716fe485d874dd4d3feb41a0e56ea871cdd2a0"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.679311 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31578b15-f84b-4862-ae52-6720dac8f5e2" path="/var/lib/kubelet/pods/31578b15-f84b-4862-ae52-6720dac8f5e2/volumes" Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.937657 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7ee3b7a0-9eb3-4702-8fb7-3286df60b21b","Type":"ContainerStarted","Data":"3e25aa26aea86ebe53beaeba095f2cca5eae987565371ea0d1d0d037281bc4a1"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.938095 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.941978 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fdf78a6-b107-4949-affa-6152d15afda0" containerID="ef9369d555365416b278955eec54a4b9a28ad97b7c0e8c0f39d301bd6d2b5a92" exitCode=0 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.942010 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fdf78a6-b107-4949-affa-6152d15afda0" containerID="5c6a6597ad7154fd7a253fa86eb6c5cde8565f20b6949e97002d664b75764604" exitCode=0 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 
05:10:37.942018 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fdf78a6-b107-4949-affa-6152d15afda0" containerID="8f8cabcb03903e4d89761a18703dd5c97870cd2e3d0ef375a4910ff30a8d99cf" exitCode=0 Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.942040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerDied","Data":"ef9369d555365416b278955eec54a4b9a28ad97b7c0e8c0f39d301bd6d2b5a92"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.942076 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerDied","Data":"5c6a6597ad7154fd7a253fa86eb6c5cde8565f20b6949e97002d664b75764604"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.942098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerDied","Data":"8f8cabcb03903e4d89761a18703dd5c97870cd2e3d0ef375a4910ff30a8d99cf"} Mar 21 05:10:37 crc kubenswrapper[4775]: I0321 05:10:37.964045 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.35142522 podStartE2EDuration="2.964019698s" podCreationTimestamp="2026-03-21 05:10:35 +0000 UTC" firstStartedPulling="2026-03-21 05:10:36.766468843 +0000 UTC m=+1389.742932467" lastFinishedPulling="2026-03-21 05:10:37.379063321 +0000 UTC m=+1390.355526945" observedRunningTime="2026-03-21 05:10:37.955390244 +0000 UTC m=+1390.931853878" watchObservedRunningTime="2026-03-21 05:10:37.964019698 +0000 UTC m=+1390.940483332" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.088446 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183662 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183769 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183799 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk4fk\" (UniqueName: 
\"kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.185522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml\") pod \"7fdf78a6-b107-4949-affa-6152d15afda0\" (UID: \"7fdf78a6-b107-4949-affa-6152d15afda0\") " Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.183903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.185422 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.187004 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.187098 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fdf78a6-b107-4949-affa-6152d15afda0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.190156 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts" (OuterVolumeSpecName: "scripts") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.191746 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk" (OuterVolumeSpecName: "kube-api-access-lk4fk") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "kube-api-access-lk4fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.222723 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.286500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.288694 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.288725 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.288735 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk4fk\" (UniqueName: \"kubernetes.io/projected/7fdf78a6-b107-4949-affa-6152d15afda0-kube-api-access-lk4fk\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.288748 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.301845 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data" (OuterVolumeSpecName: "config-data") pod "7fdf78a6-b107-4949-affa-6152d15afda0" (UID: "7fdf78a6-b107-4949-affa-6152d15afda0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.390099 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdf78a6-b107-4949-affa-6152d15afda0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.951413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fdf78a6-b107-4949-affa-6152d15afda0","Type":"ContainerDied","Data":"0199a14e4f5a12a8019f727e78e7f15430a4ee84de969cda88096ca15bbed15b"} Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.951706 4775 scope.go:117] "RemoveContainer" containerID="ef9369d555365416b278955eec54a4b9a28ad97b7c0e8c0f39d301bd6d2b5a92" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.951459 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.980646 4775 scope.go:117] "RemoveContainer" containerID="02056fc0b14ba46d658b8a5aac716fe485d874dd4d3feb41a0e56ea871cdd2a0" Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.992375 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:38 crc kubenswrapper[4775]: I0321 05:10:38.998095 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.004126 4775 scope.go:117] "RemoveContainer" containerID="5c6a6597ad7154fd7a253fa86eb6c5cde8565f20b6949e97002d664b75764604" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.025600 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:39 crc kubenswrapper[4775]: E0321 05:10:39.026088 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-notification-agent" Mar 21 05:10:39 crc 
kubenswrapper[4775]: I0321 05:10:39.026109 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-notification-agent" Mar 21 05:10:39 crc kubenswrapper[4775]: E0321 05:10:39.026145 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-central-agent" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026155 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-central-agent" Mar 21 05:10:39 crc kubenswrapper[4775]: E0321 05:10:39.026186 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="sg-core" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026196 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="sg-core" Mar 21 05:10:39 crc kubenswrapper[4775]: E0321 05:10:39.026212 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="proxy-httpd" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026219 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="proxy-httpd" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026440 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="proxy-httpd" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026467 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-central-agent" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.026487 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="sg-core" Mar 21 05:10:39 crc 
kubenswrapper[4775]: I0321 05:10:39.026507 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" containerName="ceilometer-notification-agent" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.028697 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.030420 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.032379 4775 scope.go:117] "RemoveContainer" containerID="8f8cabcb03903e4d89761a18703dd5c97870cd2e3d0ef375a4910ff30a8d99cf" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.032661 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.033051 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.051518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.104778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-scripts\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.104867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59bs7\" (UniqueName: \"kubernetes.io/projected/15d97495-428d-47e0-a115-99c7fd08850a-kube-api-access-59bs7\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 
05:10:39.104889 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-run-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.104911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-log-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.105179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.105298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.105379 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-config-data\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.105406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.206920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.206981 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-config-data\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207003 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-scripts\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59bs7\" (UniqueName: \"kubernetes.io/projected/15d97495-428d-47e0-a115-99c7fd08850a-kube-api-access-59bs7\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" 
Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-run-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-log-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.207253 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.208989 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-run-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.208994 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d97495-428d-47e0-a115-99c7fd08850a-log-httpd\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.213847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.214225 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.220635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-config-data\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.226547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.228723 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d97495-428d-47e0-a115-99c7fd08850a-scripts\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.230625 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59bs7\" (UniqueName: \"kubernetes.io/projected/15d97495-428d-47e0-a115-99c7fd08850a-kube-api-access-59bs7\") pod \"ceilometer-0\" (UID: \"15d97495-428d-47e0-a115-99c7fd08850a\") " pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.358657 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.678897 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdf78a6-b107-4949-affa-6152d15afda0" path="/var/lib/kubelet/pods/7fdf78a6-b107-4949-affa-6152d15afda0/volumes" Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.813844 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 05:10:39 crc kubenswrapper[4775]: W0321 05:10:39.816946 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15d97495_428d_47e0_a115_99c7fd08850a.slice/crio-1622581afed8ba9cf1c1c7cb429c446134537918cd31333b20a211b831399aa7 WatchSource:0}: Error finding container 1622581afed8ba9cf1c1c7cb429c446134537918cd31333b20a211b831399aa7: Status 404 returned error can't find the container with id 1622581afed8ba9cf1c1c7cb429c446134537918cd31333b20a211b831399aa7 Mar 21 05:10:39 crc kubenswrapper[4775]: I0321 05:10:39.963272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d97495-428d-47e0-a115-99c7fd08850a","Type":"ContainerStarted","Data":"1622581afed8ba9cf1c1c7cb429c446134537918cd31333b20a211b831399aa7"} Mar 21 05:10:41 crc kubenswrapper[4775]: I0321 05:10:41.984006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d97495-428d-47e0-a115-99c7fd08850a","Type":"ContainerStarted","Data":"24e9203926ce20d73abad431ad234bd89366d9c12b6fe8212d5a1ee4d29a327c"} Mar 21 05:10:41 crc kubenswrapper[4775]: I0321 05:10:41.984602 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d97495-428d-47e0-a115-99c7fd08850a","Type":"ContainerStarted","Data":"4d7ec2197c044f2c92cf604a4c8062ca6bb12683a9edaf586742518fe3f3dabf"} Mar 21 05:10:42 crc kubenswrapper[4775]: I0321 05:10:42.139396 4775 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:42 crc kubenswrapper[4775]: I0321 05:10:42.573862 4775 scope.go:117] "RemoveContainer" containerID="ca9c08541581bc07e396b682f2ba269a753d458f0a5cc3ff5f26ace1828dea81" Mar 21 05:10:42 crc kubenswrapper[4775]: I0321 05:10:42.995926 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d97495-428d-47e0-a115-99c7fd08850a","Type":"ContainerStarted","Data":"2ddaecf980b5884d76751d1c960d095146d81d5cd9ff2519987921267a5e42d3"} Mar 21 05:10:43 crc kubenswrapper[4775]: I0321 05:10:43.248325 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:45 crc kubenswrapper[4775]: I0321 05:10:45.028454 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d97495-428d-47e0-a115-99c7fd08850a","Type":"ContainerStarted","Data":"3deed700d08f13985c8e9b21af8cafbca8a7db18fd1bc5bc6c6276a4d26500ae"} Mar 21 05:10:45 crc kubenswrapper[4775]: I0321 05:10:45.030102 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 05:10:45 crc kubenswrapper[4775]: I0321 05:10:45.066455 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.876802602 podStartE2EDuration="7.066433712s" podCreationTimestamp="2026-03-21 05:10:38 +0000 UTC" firstStartedPulling="2026-03-21 05:10:39.820386857 +0000 UTC m=+1392.796850481" lastFinishedPulling="2026-03-21 05:10:44.010017967 +0000 UTC m=+1396.986481591" observedRunningTime="2026-03-21 05:10:45.056715247 +0000 UTC m=+1398.033178881" watchObservedRunningTime="2026-03-21 05:10:45.066433712 +0000 UTC m=+1398.042897336" Mar 21 05:10:46 crc kubenswrapper[4775]: I0321 05:10:46.296575 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 05:10:46 crc kubenswrapper[4775]: I0321 05:10:46.741364 4775 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="rabbitmq" containerID="cri-o://c1cc3f5c012ce3f528f11e083192eb0842f7dad8501df783855a594cab5449da" gracePeriod=604796 Mar 21 05:10:47 crc kubenswrapper[4775]: I0321 05:10:47.742009 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="rabbitmq" containerID="cri-o://2b205fc4ed697ad5cbf0298ac1f6315a11af8ff066df4aa11b086ace7ab697d3" gracePeriod=604796 Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.100552 4775 generic.go:334] "Generic (PLEG): container finished" podID="839e915e-8197-48e9-8b69-56ac420a1eed" containerID="c1cc3f5c012ce3f528f11e083192eb0842f7dad8501df783855a594cab5449da" exitCode=0 Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.100571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerDied","Data":"c1cc3f5c012ce3f528f11e083192eb0842f7dad8501df783855a594cab5449da"} Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.356065 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.377428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.377477 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.377554 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.378234 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmtzj\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.378327 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.378453 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.378498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.378790 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.379016 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.379277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.385027 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj" (OuterVolumeSpecName: "kube-api-access-lmtzj") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "kube-api-access-lmtzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.392572 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.392670 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.392715 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.392777 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls\") pod \"839e915e-8197-48e9-8b69-56ac420a1eed\" (UID: \"839e915e-8197-48e9-8b69-56ac420a1eed\") " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.395837 4775 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.395910 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.395926 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmtzj\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-kube-api-access-lmtzj\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.395944 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.403591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info" (OuterVolumeSpecName: "pod-info") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.403743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.412296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.412594 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.446457 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data" (OuterVolumeSpecName: "config-data") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.487234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf" (OuterVolumeSpecName: "server-conf") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497580 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497616 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/839e915e-8197-48e9-8b69-56ac420a1eed-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497645 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497658 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/839e915e-8197-48e9-8b69-56ac420a1eed-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497669 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/839e915e-8197-48e9-8b69-56ac420a1eed-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.497679 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.584334 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.586291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "839e915e-8197-48e9-8b69-56ac420a1eed" (UID: "839e915e-8197-48e9-8b69-56ac420a1eed"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.614476 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:53 crc kubenswrapper[4775]: I0321 05:10:53.614781 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/839e915e-8197-48e9-8b69-56ac420a1eed-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.116466 4775 generic.go:334] "Generic (PLEG): container finished" podID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerID="2b205fc4ed697ad5cbf0298ac1f6315a11af8ff066df4aa11b086ace7ab697d3" exitCode=0 Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.116572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerDied","Data":"2b205fc4ed697ad5cbf0298ac1f6315a11af8ff066df4aa11b086ace7ab697d3"} Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.120412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"839e915e-8197-48e9-8b69-56ac420a1eed","Type":"ContainerDied","Data":"ef63fbfadda52109f2b3ecef0f45ed06c3a90225a2c4d5559784277a4c40c9dd"} Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.120478 4775 scope.go:117] "RemoveContainer" containerID="c1cc3f5c012ce3f528f11e083192eb0842f7dad8501df783855a594cab5449da" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.120483 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.159710 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.175212 4775 scope.go:117] "RemoveContainer" containerID="5a7046cba7a8f5bae4cf5ebd270104b81a62d747ce978d3680ad2d2e0a16d243" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.175366 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.231247 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:54 crc kubenswrapper[4775]: E0321 05:10:54.231713 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="rabbitmq" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.231726 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="rabbitmq" Mar 21 05:10:54 crc kubenswrapper[4775]: E0321 05:10:54.231742 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="setup-container" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.231751 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="setup-container" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.231961 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" containerName="rabbitmq" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.232901 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.235661 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.235879 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.236196 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.236305 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.236466 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.236977 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kmjnx" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.237136 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.246158 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341040 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8t8r\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-kube-api-access-c8t8r\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341139 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5e83941-a38d-4ee9-b967-1dac69c5a55b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341408 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5e83941-a38d-4ee9-b967-1dac69c5a55b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341451 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-tls\") 
pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341973 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.341997 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.398182 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443113 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443195 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443364 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443427 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443452 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443514 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdt6\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443580 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443598 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf\") pod \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\" (UID: \"375fb8b7-b673-4fd7-ae51-5f82f33c196f\") " Mar 21 05:10:54 crc kubenswrapper[4775]: 
I0321 05:10:54.443888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443924 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.443966 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444020 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8t8r\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-kube-api-access-c8t8r\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444094 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e5e83941-a38d-4ee9-b967-1dac69c5a55b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444251 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5e83941-a38d-4ee9-b967-1dac69c5a55b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444387 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.444416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " 
pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.445774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.446154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.446615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.446841 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.447079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.447805 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.447909 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5e83941-a38d-4ee9-b967-1dac69c5a55b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.448702 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.450149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.454885 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5e83941-a38d-4ee9-b967-1dac69c5a55b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.458667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5e83941-a38d-4ee9-b967-1dac69c5a55b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.460890 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6" (OuterVolumeSpecName: "kube-api-access-vqdt6") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "kube-api-access-vqdt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.462108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.462132 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.463068 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.473300 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.473550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8t8r\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-kube-api-access-c8t8r\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.473938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5e83941-a38d-4ee9-b967-1dac69c5a55b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.474790 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info" (OuterVolumeSpecName: "pod-info") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: 
"375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.517233 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data" (OuterVolumeSpecName: "config-data") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.536981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"e5e83941-a38d-4ee9-b967-1dac69c5a55b\") " pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546068 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375fb8b7-b673-4fd7-ae51-5f82f33c196f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546109 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546137 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546149 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdt6\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-kube-api-access-vqdt6\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc 
kubenswrapper[4775]: I0321 05:10:54.546161 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546172 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375fb8b7-b673-4fd7-ae51-5f82f33c196f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546183 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546194 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.546221 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.560496 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf" (OuterVolumeSpecName: "server-conf") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.571854 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.582177 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.594074 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "375fb8b7-b673-4fd7-ae51-5f82f33c196f" (UID: "375fb8b7-b673-4fd7-ae51-5f82f33c196f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.647609 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.647642 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375fb8b7-b673-4fd7-ae51-5f82f33c196f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:54 crc kubenswrapper[4775]: I0321 05:10:54.647653 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375fb8b7-b673-4fd7-ae51-5f82f33c196f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.004810 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-9m26s"] Mar 21 05:10:55 crc kubenswrapper[4775]: E0321 05:10:55.005576 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="rabbitmq" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.005596 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="rabbitmq" Mar 21 05:10:55 crc kubenswrapper[4775]: E0321 05:10:55.005613 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="setup-container" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.005622 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="setup-container" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.005848 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" containerName="rabbitmq" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.007012 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.011018 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.019591 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-9m26s"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056633 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" 
Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056737 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq54r\" (UniqueName: \"kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056876 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056927 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.056983 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " 
pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.078249 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.150976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5e83941-a38d-4ee9-b967-1dac69c5a55b","Type":"ContainerStarted","Data":"9f1c3a538f47d58bc76013b107bdff68081d9c8cfd6e7ffdae9680d305e10223"} Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.156106 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"375fb8b7-b673-4fd7-ae51-5f82f33c196f","Type":"ContainerDied","Data":"56ec7d33ff14d09de4f061f404a6116851871ee2ca592e6f5daea8ccedd576e8"} Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.156170 4775 scope.go:117] "RemoveContainer" containerID="2b205fc4ed697ad5cbf0298ac1f6315a11af8ff066df4aa11b086ace7ab697d3" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.156207 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc 
kubenswrapper[4775]: I0321 05:10:55.158481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.158495 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq54r\" (UniqueName: \"kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.159582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.160090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.160940 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.162533 
4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.163200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.163860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.164286 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-9m26s"] Mar 21 05:10:55 crc kubenswrapper[4775]: E0321 05:10:55.167599 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nq54r], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-d558885bc-9m26s" podUID="eb994900-6140-4a10-aafd-0e9a6d6dfcc9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.186721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq54r\" (UniqueName: \"kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r\") pod \"dnsmasq-dns-d558885bc-9m26s\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.203672 4775 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-mmgp9"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.205549 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.226786 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-mmgp9"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.250690 4775 scope.go:117] "RemoveContainer" containerID="1c4292485e2c0ee4f0c83f962fffc48a32f990f8105220ac94e44a4691b7ff1f" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259766 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259827 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-config\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259886 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghvj6\" (UniqueName: \"kubernetes.io/projected/4918607c-6074-4fb3-a0a0-8def479058a0-kube-api-access-ghvj6\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.259959 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.260037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.295359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.311160 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.338332 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:55 crc kubenswrapper[4775]: 
I0321 05:10:55.343658 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.346504 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.346980 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.347848 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.347934 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.348038 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.348180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.348343 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5pn7h" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.356084 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426381 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426661 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426774 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dn4q\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-kube-api-access-2dn4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426805 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.426960 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427012 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427044 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427080 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-config\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c95486b5-f2ad-4098-912d-6749b329824b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427180 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghvj6\" (UniqueName: \"kubernetes.io/projected/4918607c-6074-4fb3-a0a0-8def479058a0-kube-api-access-ghvj6\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427302 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c95486b5-f2ad-4098-912d-6749b329824b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427338 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.427368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.428384 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.428721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.429021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.430920 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.431050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-config\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.431211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4918607c-6074-4fb3-a0a0-8def479058a0-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.445448 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghvj6\" (UniqueName: \"kubernetes.io/projected/4918607c-6074-4fb3-a0a0-8def479058a0-kube-api-access-ghvj6\") pod \"dnsmasq-dns-78c64bc9c5-mmgp9\" (UID: \"4918607c-6074-4fb3-a0a0-8def479058a0\") " pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-config-data\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528652 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dn4q\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-kube-api-access-2dn4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528749 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc 
kubenswrapper[4775]: I0321 05:10:55.528768 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c95486b5-f2ad-4098-912d-6749b329824b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c95486b5-f2ad-4098-912d-6749b329824b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.528869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.530174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.530387 4775 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.530688 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.530726 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c95486b5-f2ad-4098-912d-6749b329824b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.530876 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.532087 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.532450 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c95486b5-f2ad-4098-912d-6749b329824b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.534280 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.534925 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.537839 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c95486b5-f2ad-4098-912d-6749b329824b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.549231 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dn4q\" (UniqueName: \"kubernetes.io/projected/c95486b5-f2ad-4098-912d-6749b329824b-kube-api-access-2dn4q\") pod \"rabbitmq-cell1-server-0\" (UID: \"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.571202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c95486b5-f2ad-4098-912d-6749b329824b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.590356 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.664243 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.695333 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375fb8b7-b673-4fd7-ae51-5f82f33c196f" path="/var/lib/kubelet/pods/375fb8b7-b673-4fd7-ae51-5f82f33c196f/volumes" Mar 21 05:10:55 crc kubenswrapper[4775]: I0321 05:10:55.696516 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="839e915e-8197-48e9-8b69-56ac420a1eed" path="/var/lib/kubelet/pods/839e915e-8197-48e9-8b69-56ac420a1eed/volumes" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.058499 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-mmgp9"] Mar 21 05:10:56 crc kubenswrapper[4775]: W0321 05:10:56.065081 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4918607c_6074_4fb3_a0a0_8def479058a0.slice/crio-062d775e8dc7fe83d1d19047e5f49f7fb2f5f14974bc435df1465a3936dcebc8 WatchSource:0}: Error finding container 062d775e8dc7fe83d1d19047e5f49f7fb2f5f14974bc435df1465a3936dcebc8: Status 404 returned error can't find the container with id 062d775e8dc7fe83d1d19047e5f49f7fb2f5f14974bc435df1465a3936dcebc8 Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.170813 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" event={"ID":"4918607c-6074-4fb3-a0a0-8def479058a0","Type":"ContainerStarted","Data":"062d775e8dc7fe83d1d19047e5f49f7fb2f5f14974bc435df1465a3936dcebc8"} Mar 21 05:10:56 crc 
kubenswrapper[4775]: I0321 05:10:56.170833 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.188498 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 05:10:56 crc kubenswrapper[4775]: W0321 05:10:56.190255 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc95486b5_f2ad_4098_912d_6749b329824b.slice/crio-1bb5e8dd2af95a8994651214c672edc0ac0ebfc669a4e75bf466f382e4a51488 WatchSource:0}: Error finding container 1bb5e8dd2af95a8994651214c672edc0ac0ebfc669a4e75bf466f382e4a51488: Status 404 returned error can't find the container with id 1bb5e8dd2af95a8994651214c672edc0ac0ebfc669a4e75bf466f382e4a51488 Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.324452 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444611 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq54r\" (UniqueName: \"kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444690 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444760 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444854 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.444975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.445049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config\") pod \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\" (UID: \"eb994900-6140-4a10-aafd-0e9a6d6dfcc9\") " Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.445918 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config" (OuterVolumeSpecName: "config") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.446471 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.446800 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.447149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.447523 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.447897 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.450486 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r" (OuterVolumeSpecName: "kube-api-access-nq54r") pod "eb994900-6140-4a10-aafd-0e9a6d6dfcc9" (UID: "eb994900-6140-4a10-aafd-0e9a6d6dfcc9"). InnerVolumeSpecName "kube-api-access-nq54r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547224 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547262 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547272 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547280 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq54r\" (UniqueName: 
\"kubernetes.io/projected/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-kube-api-access-nq54r\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547289 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547296 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:56 crc kubenswrapper[4775]: I0321 05:10:56.547305 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eb994900-6140-4a10-aafd-0e9a6d6dfcc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.184206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5e83941-a38d-4ee9-b967-1dac69c5a55b","Type":"ContainerStarted","Data":"92cf1e237e929c497cbf8f575434685971036de8960a393e86900a1e6642f0ed"} Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.187571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c95486b5-f2ad-4098-912d-6749b329824b","Type":"ContainerStarted","Data":"1bb5e8dd2af95a8994651214c672edc0ac0ebfc669a4e75bf466f382e4a51488"} Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.189635 4775 generic.go:334] "Generic (PLEG): container finished" podID="4918607c-6074-4fb3-a0a0-8def479058a0" containerID="83fa5958fe9b584a13b72951e70ec159f4909e5520901e4dceebd69c63cb7ca9" exitCode=0 Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.189690 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-9m26s" Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.189703 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" event={"ID":"4918607c-6074-4fb3-a0a0-8def479058a0","Type":"ContainerDied","Data":"83fa5958fe9b584a13b72951e70ec159f4909e5520901e4dceebd69c63cb7ca9"} Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.420790 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-9m26s"] Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.431392 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-9m26s"] Mar 21 05:10:57 crc kubenswrapper[4775]: I0321 05:10:57.680084 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb994900-6140-4a10-aafd-0e9a6d6dfcc9" path="/var/lib/kubelet/pods/eb994900-6140-4a10-aafd-0e9a6d6dfcc9/volumes" Mar 21 05:10:58 crc kubenswrapper[4775]: I0321 05:10:58.208360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c95486b5-f2ad-4098-912d-6749b329824b","Type":"ContainerStarted","Data":"f565924e64e065b4ec412c0136fb28d6de7cffeff14b255d7fd47acd3851a689"} Mar 21 05:10:58 crc kubenswrapper[4775]: I0321 05:10:58.213284 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" event={"ID":"4918607c-6074-4fb3-a0a0-8def479058a0","Type":"ContainerStarted","Data":"32383fa63f060e6b7e3508ff1589d2a5f19f1eb11d6ebff99c27ff830eb0347f"} Mar 21 05:10:58 crc kubenswrapper[4775]: I0321 05:10:58.213492 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:10:58 crc kubenswrapper[4775]: I0321 05:10:58.249899 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" podStartSLOduration=3.249882356 
podStartE2EDuration="3.249882356s" podCreationTimestamp="2026-03-21 05:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:10:58.24932509 +0000 UTC m=+1411.225788714" watchObservedRunningTime="2026-03-21 05:10:58.249882356 +0000 UTC m=+1411.226345980" Mar 21 05:11:02 crc kubenswrapper[4775]: I0321 05:11:02.482234 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:11:02 crc kubenswrapper[4775]: I0321 05:11:02.482525 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:11:02 crc kubenswrapper[4775]: I0321 05:11:02.482578 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:11:02 crc kubenswrapper[4775]: I0321 05:11:02.483382 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:11:02 crc kubenswrapper[4775]: I0321 05:11:02.483454 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" 
containerName="machine-config-daemon" containerID="cri-o://b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda" gracePeriod=600 Mar 21 05:11:03 crc kubenswrapper[4775]: I0321 05:11:03.256612 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda" exitCode=0 Mar 21 05:11:03 crc kubenswrapper[4775]: I0321 05:11:03.256652 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda"} Mar 21 05:11:03 crc kubenswrapper[4775]: I0321 05:11:03.256937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"} Mar 21 05:11:03 crc kubenswrapper[4775]: I0321 05:11:03.256959 4775 scope.go:117] "RemoveContainer" containerID="09c71a6e96dc622a58adf5b83a67eab26ff45301d2bfcc2a43f1cbd9eb2d9791" Mar 21 05:11:05 crc kubenswrapper[4775]: I0321 05:11:05.593235 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-mmgp9" Mar 21 05:11:05 crc kubenswrapper[4775]: I0321 05:11:05.691682 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"] Mar 21 05:11:05 crc kubenswrapper[4775]: I0321 05:11:05.691945 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerName="dnsmasq-dns" containerID="cri-o://90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325" gracePeriod=10 Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 
05:11:06.245910 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.289076 4775 generic.go:334] "Generic (PLEG): container finished" podID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerID="90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325" exitCode=0 Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.289141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" event={"ID":"d725aabc-ba32-4c0e-bc91-8819d73cae40","Type":"ContainerDied","Data":"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325"} Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.289180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" event={"ID":"d725aabc-ba32-4c0e-bc91-8819d73cae40","Type":"ContainerDied","Data":"5dc8047188086fad5c8d0b1fabab1913e5036d9e22f49d173fa13d6dd588e3c0"} Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.289188 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-rtnbf" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.289202 4775 scope.go:117] "RemoveContainer" containerID="90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.324336 4775 scope.go:117] "RemoveContainer" containerID="e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339060 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339192 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7xb\" (UniqueName: \"kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339260 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339385 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.339434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config\") pod \"d725aabc-ba32-4c0e-bc91-8819d73cae40\" (UID: \"d725aabc-ba32-4c0e-bc91-8819d73cae40\") " Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.366531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb" (OuterVolumeSpecName: "kube-api-access-dl7xb") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "kube-api-access-dl7xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.398802 4775 scope.go:117] "RemoveContainer" containerID="90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325" Mar 21 05:11:06 crc kubenswrapper[4775]: E0321 05:11:06.399291 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325\": container with ID starting with 90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325 not found: ID does not exist" containerID="90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.399322 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325"} err="failed to get container status \"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325\": rpc 
error: code = NotFound desc = could not find container \"90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325\": container with ID starting with 90f0ef44d50d3f68a6e80b10a816b133ae6b897ca8881964ca82e79aca5bf325 not found: ID does not exist" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.399350 4775 scope.go:117] "RemoveContainer" containerID="e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974" Mar 21 05:11:06 crc kubenswrapper[4775]: E0321 05:11:06.400700 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974\": container with ID starting with e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974 not found: ID does not exist" containerID="e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.400776 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974"} err="failed to get container status \"e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974\": rpc error: code = NotFound desc = could not find container \"e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974\": container with ID starting with e55382d5f315dca58dbeb0320372028174ccdbb56163037ad99b712395ff7974 not found: ID does not exist" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.426378 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.435379 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.441597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config" (OuterVolumeSpecName: "config") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.444327 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.445517 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.445538 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7xb\" (UniqueName: \"kubernetes.io/projected/d725aabc-ba32-4c0e-bc91-8819d73cae40-kube-api-access-dl7xb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.445548 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.445558 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.445567 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-config\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.450789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d725aabc-ba32-4c0e-bc91-8819d73cae40" (UID: "d725aabc-ba32-4c0e-bc91-8819d73cae40"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.546763 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d725aabc-ba32-4c0e-bc91-8819d73cae40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.622520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"] Mar 21 05:11:06 crc kubenswrapper[4775]: I0321 05:11:06.630530 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-rtnbf"] Mar 21 05:11:07 crc kubenswrapper[4775]: I0321 05:11:07.673475 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" path="/var/lib/kubelet/pods/d725aabc-ba32-4c0e-bc91-8819d73cae40/volumes" Mar 21 05:11:09 crc kubenswrapper[4775]: I0321 05:11:09.369685 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.125779 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m"] Mar 21 05:11:14 crc kubenswrapper[4775]: E0321 05:11:14.127530 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerName="dnsmasq-dns" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.127553 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerName="dnsmasq-dns" Mar 21 05:11:14 crc kubenswrapper[4775]: E0321 05:11:14.127624 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerName="init" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.127639 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" 
containerName="init" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.127928 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d725aabc-ba32-4c0e-bc91-8819d73cae40" containerName="dnsmasq-dns" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.129216 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.133252 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.133574 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.133765 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.133928 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.153167 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m"] Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.288400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7mq\" (UniqueName: \"kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.288478 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.288520 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.288640 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.389842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.389928 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7mq\" (UniqueName: \"kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: 
\"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.389965 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.389998 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.395932 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.396146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 
05:11:14.399976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.414577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7mq\" (UniqueName: \"kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:14 crc kubenswrapper[4775]: I0321 05:11:14.483563 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:15 crc kubenswrapper[4775]: I0321 05:11:15.293959 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m"] Mar 21 05:11:15 crc kubenswrapper[4775]: W0321 05:11:15.298523 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf41367b2_433d_48f7_af75_575be4b318fc.slice/crio-fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234 WatchSource:0}: Error finding container fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234: Status 404 returned error can't find the container with id fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234 Mar 21 05:11:15 crc kubenswrapper[4775]: I0321 05:11:15.371470 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" 
event={"ID":"f41367b2-433d-48f7-af75-575be4b318fc","Type":"ContainerStarted","Data":"fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234"} Mar 21 05:11:25 crc kubenswrapper[4775]: I0321 05:11:25.476785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" event={"ID":"f41367b2-433d-48f7-af75-575be4b318fc","Type":"ContainerStarted","Data":"fedd5444d5690bdee4579c2ab7d7bd0312f0ec5b6822006b2d668946f12b04e5"} Mar 21 05:11:25 crc kubenswrapper[4775]: I0321 05:11:25.498554 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" podStartSLOduration=2.328201561 podStartE2EDuration="11.498520349s" podCreationTimestamp="2026-03-21 05:11:14 +0000 UTC" firstStartedPulling="2026-03-21 05:11:15.300219168 +0000 UTC m=+1428.276682792" lastFinishedPulling="2026-03-21 05:11:24.470537956 +0000 UTC m=+1437.447001580" observedRunningTime="2026-03-21 05:11:25.49606445 +0000 UTC m=+1438.472528074" watchObservedRunningTime="2026-03-21 05:11:25.498520349 +0000 UTC m=+1438.474983983" Mar 21 05:11:29 crc kubenswrapper[4775]: I0321 05:11:29.514696 4775 generic.go:334] "Generic (PLEG): container finished" podID="e5e83941-a38d-4ee9-b967-1dac69c5a55b" containerID="92cf1e237e929c497cbf8f575434685971036de8960a393e86900a1e6642f0ed" exitCode=0 Mar 21 05:11:29 crc kubenswrapper[4775]: I0321 05:11:29.514775 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5e83941-a38d-4ee9-b967-1dac69c5a55b","Type":"ContainerDied","Data":"92cf1e237e929c497cbf8f575434685971036de8960a393e86900a1e6642f0ed"} Mar 21 05:11:30 crc kubenswrapper[4775]: I0321 05:11:30.527698 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e5e83941-a38d-4ee9-b967-1dac69c5a55b","Type":"ContainerStarted","Data":"6bacd31d0f13fb2a0595ae6cad760ba6c31c94539fc57bb47ede52a58f85d70a"} Mar 21 05:11:30 crc kubenswrapper[4775]: I0321 05:11:30.528316 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 05:11:30 crc kubenswrapper[4775]: I0321 05:11:30.529886 4775 generic.go:334] "Generic (PLEG): container finished" podID="c95486b5-f2ad-4098-912d-6749b329824b" containerID="f565924e64e065b4ec412c0136fb28d6de7cffeff14b255d7fd47acd3851a689" exitCode=0 Mar 21 05:11:30 crc kubenswrapper[4775]: I0321 05:11:30.529923 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c95486b5-f2ad-4098-912d-6749b329824b","Type":"ContainerDied","Data":"f565924e64e065b4ec412c0136fb28d6de7cffeff14b255d7fd47acd3851a689"} Mar 21 05:11:30 crc kubenswrapper[4775]: I0321 05:11:30.558333 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.558311213 podStartE2EDuration="36.558311213s" podCreationTimestamp="2026-03-21 05:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:30.551281145 +0000 UTC m=+1443.527744769" watchObservedRunningTime="2026-03-21 05:11:30.558311213 +0000 UTC m=+1443.534774847" Mar 21 05:11:31 crc kubenswrapper[4775]: I0321 05:11:31.542523 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c95486b5-f2ad-4098-912d-6749b329824b","Type":"ContainerStarted","Data":"b46026fb47cd615b91a045db81faedfd96537575f80e33a593af4cbe28d57fe6"} Mar 21 05:11:31 crc kubenswrapper[4775]: I0321 05:11:31.543425 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 05:11:31 crc kubenswrapper[4775]: I0321 05:11:31.567210 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.567189829 podStartE2EDuration="36.567189829s" podCreationTimestamp="2026-03-21 05:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:11:31.560594974 +0000 UTC m=+1444.537058608" watchObservedRunningTime="2026-03-21 05:11:31.567189829 +0000 UTC m=+1444.543653463" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.406908 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.409766 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.430928 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.458220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.458423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdsjf\" (UniqueName: \"kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.458467 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.561547 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdsjf\" (UniqueName: \"kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.561627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.561681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.562254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.562886 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.606497 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdsjf\" (UniqueName: \"kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf\") pod \"redhat-operators-z6mp2\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:35 crc kubenswrapper[4775]: I0321 05:11:35.733672 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.204584 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.600126 4775 generic.go:334] "Generic (PLEG): container finished" podID="2bddbf00-7ac4-4180-b958-c377ef640334" containerID="86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.600200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerDied","Data":"86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e"} Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.600230 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerStarted","Data":"2332fb4c24ce5b7478674331300ff4e968647562e22dc21f81edba2ae13017f2"} Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.601932 4775 generic.go:334] "Generic (PLEG): container 
finished" podID="f41367b2-433d-48f7-af75-575be4b318fc" containerID="fedd5444d5690bdee4579c2ab7d7bd0312f0ec5b6822006b2d668946f12b04e5" exitCode=0 Mar 21 05:11:36 crc kubenswrapper[4775]: I0321 05:11:36.601975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" event={"ID":"f41367b2-433d-48f7-af75-575be4b318fc","Type":"ContainerDied","Data":"fedd5444d5690bdee4579c2ab7d7bd0312f0ec5b6822006b2d668946f12b04e5"} Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.050407 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.209041 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam\") pod \"f41367b2-433d-48f7-af75-575be4b318fc\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.209158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory\") pod \"f41367b2-433d-48f7-af75-575be4b318fc\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.209201 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle\") pod \"f41367b2-433d-48f7-af75-575be4b318fc\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.209280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7mq\" (UniqueName: 
\"kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq\") pod \"f41367b2-433d-48f7-af75-575be4b318fc\" (UID: \"f41367b2-433d-48f7-af75-575be4b318fc\") " Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.226892 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f41367b2-433d-48f7-af75-575be4b318fc" (UID: "f41367b2-433d-48f7-af75-575be4b318fc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.230060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq" (OuterVolumeSpecName: "kube-api-access-sc7mq") pod "f41367b2-433d-48f7-af75-575be4b318fc" (UID: "f41367b2-433d-48f7-af75-575be4b318fc"). InnerVolumeSpecName "kube-api-access-sc7mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.242080 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory" (OuterVolumeSpecName: "inventory") pod "f41367b2-433d-48f7-af75-575be4b318fc" (UID: "f41367b2-433d-48f7-af75-575be4b318fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.248656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f41367b2-433d-48f7-af75-575be4b318fc" (UID: "f41367b2-433d-48f7-af75-575be4b318fc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.311302 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.311350 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.311364 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f41367b2-433d-48f7-af75-575be4b318fc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.311378 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7mq\" (UniqueName: \"kubernetes.io/projected/f41367b2-433d-48f7-af75-575be4b318fc-kube-api-access-sc7mq\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.621196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerStarted","Data":"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd"} Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.624747 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" event={"ID":"f41367b2-433d-48f7-af75-575be4b318fc","Type":"ContainerDied","Data":"fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234"} Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.624784 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fc1e18fbb56f39a46d6d2570a607e697306df431bf99ea7b40225a84d7c29234" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.624826 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.729517 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc"] Mar 21 05:11:38 crc kubenswrapper[4775]: E0321 05:11:38.729967 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f41367b2-433d-48f7-af75-575be4b318fc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.729992 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f41367b2-433d-48f7-af75-575be4b318fc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.730208 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f41367b2-433d-48f7-af75-575be4b318fc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.730823 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.734794 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.734869 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.735010 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.735057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.745602 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc"] Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.823465 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.823682 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.823895 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dhq4\" (UniqueName: \"kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.926166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dhq4\" (UniqueName: \"kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.926230 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.927026 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.931206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: 
\"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.931515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:38 crc kubenswrapper[4775]: I0321 05:11:38.942629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dhq4\" (UniqueName: \"kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qgpzc\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:39 crc kubenswrapper[4775]: I0321 05:11:39.068607 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:39 crc kubenswrapper[4775]: I0321 05:11:39.587255 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc"] Mar 21 05:11:39 crc kubenswrapper[4775]: I0321 05:11:39.637284 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" event={"ID":"7c88e417-9ede-41d8-8337-79620ceb7798","Type":"ContainerStarted","Data":"78ac04b1c27fe6ac462f433d708306d43b03e18067a1c937699c9d1ee163b33a"} Mar 21 05:11:41 crc kubenswrapper[4775]: I0321 05:11:41.659232 4775 generic.go:334] "Generic (PLEG): container finished" podID="2bddbf00-7ac4-4180-b958-c377ef640334" containerID="04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd" exitCode=0 Mar 21 05:11:41 crc kubenswrapper[4775]: I0321 05:11:41.659278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerDied","Data":"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd"} Mar 21 05:11:42 crc kubenswrapper[4775]: I0321 05:11:42.787027 4775 scope.go:117] "RemoveContainer" containerID="9a3778eec58b097b77fc9e80249e34ac71c57ec91481179476b24785158c24e3" Mar 21 05:11:43 crc kubenswrapper[4775]: I0321 05:11:43.473161 4775 scope.go:117] "RemoveContainer" containerID="f749dd6bac424eb0695958e8b0d01fc15e3e824b72ea62322c8b3aee65b4d200" Mar 21 05:11:44 crc kubenswrapper[4775]: I0321 05:11:44.574715 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e5e83941-a38d-4ee9-b967-1dac69c5a55b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.211:5671: connect: connection refused" Mar 21 05:11:44 crc kubenswrapper[4775]: I0321 05:11:44.690890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" event={"ID":"7c88e417-9ede-41d8-8337-79620ceb7798","Type":"ContainerStarted","Data":"e57bfbd0e9b0250d0a5dde317af9eac11f2c7f53c320d67ebd8107bb4429316b"} Mar 21 05:11:44 crc kubenswrapper[4775]: I0321 05:11:44.694910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerStarted","Data":"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341"} Mar 21 05:11:44 crc kubenswrapper[4775]: I0321 05:11:44.717356 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" podStartSLOduration=2.476692311 podStartE2EDuration="6.717331494s" podCreationTimestamp="2026-03-21 05:11:38 +0000 UTC" firstStartedPulling="2026-03-21 05:11:39.596070292 +0000 UTC m=+1452.572533936" lastFinishedPulling="2026-03-21 05:11:43.836709485 +0000 UTC m=+1456.813173119" observedRunningTime="2026-03-21 05:11:44.711556362 +0000 UTC m=+1457.688020006" watchObservedRunningTime="2026-03-21 05:11:44.717331494 +0000 UTC m=+1457.693795118" Mar 21 05:11:44 crc kubenswrapper[4775]: I0321 05:11:44.741834 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6mp2" podStartSLOduration=2.087083223 podStartE2EDuration="9.741813703s" podCreationTimestamp="2026-03-21 05:11:35 +0000 UTC" firstStartedPulling="2026-03-21 05:11:36.601536196 +0000 UTC m=+1449.577999820" lastFinishedPulling="2026-03-21 05:11:44.256266676 +0000 UTC m=+1457.232730300" observedRunningTime="2026-03-21 05:11:44.739473847 +0000 UTC m=+1457.715937471" watchObservedRunningTime="2026-03-21 05:11:44.741813703 +0000 UTC m=+1457.718277347" Mar 21 05:11:45 crc kubenswrapper[4775]: I0321 05:11:45.672170 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 
05:11:45 crc kubenswrapper[4775]: I0321 05:11:45.734495 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:45 crc kubenswrapper[4775]: I0321 05:11:45.734744 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:46 crc kubenswrapper[4775]: I0321 05:11:46.715707 4775 generic.go:334] "Generic (PLEG): container finished" podID="7c88e417-9ede-41d8-8337-79620ceb7798" containerID="e57bfbd0e9b0250d0a5dde317af9eac11f2c7f53c320d67ebd8107bb4429316b" exitCode=0 Mar 21 05:11:46 crc kubenswrapper[4775]: I0321 05:11:46.715798 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" event={"ID":"7c88e417-9ede-41d8-8337-79620ceb7798","Type":"ContainerDied","Data":"e57bfbd0e9b0250d0a5dde317af9eac11f2c7f53c320d67ebd8107bb4429316b"} Mar 21 05:11:46 crc kubenswrapper[4775]: I0321 05:11:46.804786 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z6mp2" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="registry-server" probeResult="failure" output=< Mar 21 05:11:46 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:11:46 crc kubenswrapper[4775]: > Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.179238 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.308566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory\") pod \"7c88e417-9ede-41d8-8337-79620ceb7798\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.308621 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dhq4\" (UniqueName: \"kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4\") pod \"7c88e417-9ede-41d8-8337-79620ceb7798\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.308766 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam\") pod \"7c88e417-9ede-41d8-8337-79620ceb7798\" (UID: \"7c88e417-9ede-41d8-8337-79620ceb7798\") " Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.315582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4" (OuterVolumeSpecName: "kube-api-access-7dhq4") pod "7c88e417-9ede-41d8-8337-79620ceb7798" (UID: "7c88e417-9ede-41d8-8337-79620ceb7798"). InnerVolumeSpecName "kube-api-access-7dhq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.342287 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory" (OuterVolumeSpecName: "inventory") pod "7c88e417-9ede-41d8-8337-79620ceb7798" (UID: "7c88e417-9ede-41d8-8337-79620ceb7798"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.346828 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c88e417-9ede-41d8-8337-79620ceb7798" (UID: "7c88e417-9ede-41d8-8337-79620ceb7798"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.410681 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.410714 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c88e417-9ede-41d8-8337-79620ceb7798-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.410726 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dhq4\" (UniqueName: \"kubernetes.io/projected/7c88e417-9ede-41d8-8337-79620ceb7798-kube-api-access-7dhq4\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.736529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" event={"ID":"7c88e417-9ede-41d8-8337-79620ceb7798","Type":"ContainerDied","Data":"78ac04b1c27fe6ac462f433d708306d43b03e18067a1c937699c9d1ee163b33a"} Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.736957 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ac04b1c27fe6ac462f433d708306d43b03e18067a1c937699c9d1ee163b33a" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 
05:11:48.736788 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qgpzc" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.834957 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m"] Mar 21 05:11:48 crc kubenswrapper[4775]: E0321 05:11:48.835345 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c88e417-9ede-41d8-8337-79620ceb7798" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.835362 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c88e417-9ede-41d8-8337-79620ceb7798" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.835546 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c88e417-9ede-41d8-8337-79620ceb7798" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.836203 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.840515 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.844829 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.845364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.848567 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m"] Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.854963 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.923815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.924168 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.924290 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:48 crc kubenswrapper[4775]: I0321 05:11:48.924361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfxh\" (UniqueName: \"kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.025958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.027153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.027305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.027401 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfxh\" (UniqueName: \"kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.031198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.031321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.038552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.075813 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfxh\" (UniqueName: \"kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.156217 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.701361 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m"] Mar 21 05:11:49 crc kubenswrapper[4775]: I0321 05:11:49.750876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" event={"ID":"203df932-0574-4098-b897-ba50813f2ec1","Type":"ContainerStarted","Data":"3c801bc6a121fd4faebee72f398b0bcdeabcebcace5ab4c44442f937b77950dc"} Mar 21 05:11:50 crc kubenswrapper[4775]: I0321 05:11:50.760485 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" event={"ID":"203df932-0574-4098-b897-ba50813f2ec1","Type":"ContainerStarted","Data":"e83db7ccdc230f6a7c372f2ab3e28ef862efd0646a039a1d8f53714e19540048"} Mar 21 05:11:50 crc kubenswrapper[4775]: I0321 05:11:50.784521 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" podStartSLOduration=2.330224394 podStartE2EDuration="2.784497711s" podCreationTimestamp="2026-03-21 05:11:48 +0000 UTC" firstStartedPulling="2026-03-21 05:11:49.703557788 +0000 UTC m=+1462.680021412" 
lastFinishedPulling="2026-03-21 05:11:50.157831105 +0000 UTC m=+1463.134294729" observedRunningTime="2026-03-21 05:11:50.780098807 +0000 UTC m=+1463.756562431" watchObservedRunningTime="2026-03-21 05:11:50.784497711 +0000 UTC m=+1463.760961335" Mar 21 05:11:54 crc kubenswrapper[4775]: I0321 05:11:54.577946 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 05:11:55 crc kubenswrapper[4775]: I0321 05:11:55.796019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:55 crc kubenswrapper[4775]: I0321 05:11:55.850330 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:56 crc kubenswrapper[4775]: I0321 05:11:56.029149 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:56 crc kubenswrapper[4775]: I0321 05:11:56.836268 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6mp2" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="registry-server" containerID="cri-o://c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341" gracePeriod=2 Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.250075 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.401667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content\") pod \"2bddbf00-7ac4-4180-b958-c377ef640334\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.401804 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities\") pod \"2bddbf00-7ac4-4180-b958-c377ef640334\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.402005 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdsjf\" (UniqueName: \"kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf\") pod \"2bddbf00-7ac4-4180-b958-c377ef640334\" (UID: \"2bddbf00-7ac4-4180-b958-c377ef640334\") " Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.402754 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities" (OuterVolumeSpecName: "utilities") pod "2bddbf00-7ac4-4180-b958-c377ef640334" (UID: "2bddbf00-7ac4-4180-b958-c377ef640334"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.408203 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf" (OuterVolumeSpecName: "kube-api-access-pdsjf") pod "2bddbf00-7ac4-4180-b958-c377ef640334" (UID: "2bddbf00-7ac4-4180-b958-c377ef640334"). InnerVolumeSpecName "kube-api-access-pdsjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.504278 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdsjf\" (UniqueName: \"kubernetes.io/projected/2bddbf00-7ac4-4180-b958-c377ef640334-kube-api-access-pdsjf\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.504319 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.525417 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bddbf00-7ac4-4180-b958-c377ef640334" (UID: "2bddbf00-7ac4-4180-b958-c377ef640334"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.606790 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bddbf00-7ac4-4180-b958-c377ef640334-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.860006 4775 generic.go:334] "Generic (PLEG): container finished" podID="2bddbf00-7ac4-4180-b958-c377ef640334" containerID="c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341" exitCode=0 Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.860463 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerDied","Data":"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341"} Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.860509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-z6mp2" event={"ID":"2bddbf00-7ac4-4180-b958-c377ef640334","Type":"ContainerDied","Data":"2332fb4c24ce5b7478674331300ff4e968647562e22dc21f81edba2ae13017f2"} Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.860555 4775 scope.go:117] "RemoveContainer" containerID="c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.860879 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6mp2" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.903768 4775 scope.go:117] "RemoveContainer" containerID="04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.903922 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.911020 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6mp2"] Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.937099 4775 scope.go:117] "RemoveContainer" containerID="86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.975872 4775 scope.go:117] "RemoveContainer" containerID="c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341" Mar 21 05:11:57 crc kubenswrapper[4775]: E0321 05:11:57.976230 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341\": container with ID starting with c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341 not found: ID does not exist" containerID="c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.976267 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341"} err="failed to get container status \"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341\": rpc error: code = NotFound desc = could not find container \"c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341\": container with ID starting with c9ab0fd1a0cc329833eab925ace21a193f7a42c34989b229a72bfaf73bdc5341 not found: ID does not exist" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.976288 4775 scope.go:117] "RemoveContainer" containerID="04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd" Mar 21 05:11:57 crc kubenswrapper[4775]: E0321 05:11:57.976555 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd\": container with ID starting with 04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd not found: ID does not exist" containerID="04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.976581 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd"} err="failed to get container status \"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd\": rpc error: code = NotFound desc = could not find container \"04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd\": container with ID starting with 04d5d232cc45dcabc0c4f738da19b014f8c8b51d249691a8d9e6bed3ed0986cd not found: ID does not exist" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.976595 4775 scope.go:117] "RemoveContainer" containerID="86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e" Mar 21 05:11:57 crc kubenswrapper[4775]: E0321 
05:11:57.976831 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e\": container with ID starting with 86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e not found: ID does not exist" containerID="86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e" Mar 21 05:11:57 crc kubenswrapper[4775]: I0321 05:11:57.976885 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e"} err="failed to get container status \"86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e\": rpc error: code = NotFound desc = could not find container \"86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e\": container with ID starting with 86bb0734a78e9ce15241affa8ae7b160b9d5b0cb227f27eb1579587505a0ae1e not found: ID does not exist" Mar 21 05:11:59 crc kubenswrapper[4775]: I0321 05:11:59.672236 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" path="/var/lib/kubelet/pods/2bddbf00-7ac4-4180-b958-c377ef640334/volumes" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.146851 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567832-cm8v6"] Mar 21 05:12:00 crc kubenswrapper[4775]: E0321 05:12:00.147510 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.147544 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4775]: E0321 05:12:00.147603 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.147617 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4775]: E0321 05:12:00.147644 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.147659 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.147980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bddbf00-7ac4-4180-b958-c377ef640334" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.149044 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.151037 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.151561 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.151788 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.172434 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-cm8v6"] Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.257699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrhr8\" (UniqueName: 
\"kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8\") pod \"auto-csr-approver-29567832-cm8v6\" (UID: \"0ffbca22-6533-4ade-8f1f-eeeda82f159f\") " pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.362241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrhr8\" (UniqueName: \"kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8\") pod \"auto-csr-approver-29567832-cm8v6\" (UID: \"0ffbca22-6533-4ade-8f1f-eeeda82f159f\") " pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.387144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrhr8\" (UniqueName: \"kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8\") pod \"auto-csr-approver-29567832-cm8v6\" (UID: \"0ffbca22-6533-4ade-8f1f-eeeda82f159f\") " pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:00 crc kubenswrapper[4775]: I0321 05:12:00.472173 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:01 crc kubenswrapper[4775]: I0321 05:12:01.163202 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-cm8v6"] Mar 21 05:12:01 crc kubenswrapper[4775]: I0321 05:12:01.904758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" event={"ID":"0ffbca22-6533-4ade-8f1f-eeeda82f159f","Type":"ContainerStarted","Data":"613e7e7dde3564958173ee27dab8808a83afb18788c0f6e13f0edfacec7337ef"} Mar 21 05:12:02 crc kubenswrapper[4775]: I0321 05:12:02.914421 4775 generic.go:334] "Generic (PLEG): container finished" podID="0ffbca22-6533-4ade-8f1f-eeeda82f159f" containerID="fdab6584b0a9bddc76fc370c865954a1758e24e006007d8b2953684194b46540" exitCode=0 Mar 21 05:12:02 crc kubenswrapper[4775]: I0321 05:12:02.914460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" event={"ID":"0ffbca22-6533-4ade-8f1f-eeeda82f159f","Type":"ContainerDied","Data":"fdab6584b0a9bddc76fc370c865954a1758e24e006007d8b2953684194b46540"} Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.319930 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.388013 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrhr8\" (UniqueName: \"kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8\") pod \"0ffbca22-6533-4ade-8f1f-eeeda82f159f\" (UID: \"0ffbca22-6533-4ade-8f1f-eeeda82f159f\") " Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.396756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8" (OuterVolumeSpecName: "kube-api-access-lrhr8") pod "0ffbca22-6533-4ade-8f1f-eeeda82f159f" (UID: "0ffbca22-6533-4ade-8f1f-eeeda82f159f"). InnerVolumeSpecName "kube-api-access-lrhr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.490264 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrhr8\" (UniqueName: \"kubernetes.io/projected/0ffbca22-6533-4ade-8f1f-eeeda82f159f-kube-api-access-lrhr8\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.933907 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" event={"ID":"0ffbca22-6533-4ade-8f1f-eeeda82f159f","Type":"ContainerDied","Data":"613e7e7dde3564958173ee27dab8808a83afb18788c0f6e13f0edfacec7337ef"} Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.933952 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613e7e7dde3564958173ee27dab8808a83afb18788c0f6e13f0edfacec7337ef" Mar 21 05:12:04 crc kubenswrapper[4775]: I0321 05:12:04.933955 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-cm8v6" Mar 21 05:12:05 crc kubenswrapper[4775]: I0321 05:12:05.390063 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-89d6v"] Mar 21 05:12:05 crc kubenswrapper[4775]: I0321 05:12:05.400863 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-89d6v"] Mar 21 05:12:05 crc kubenswrapper[4775]: I0321 05:12:05.686502 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c64d94e-917c-49b6-824b-0b2bdf9691ef" path="/var/lib/kubelet/pods/0c64d94e-917c-49b6-824b-0b2bdf9691ef/volumes" Mar 21 05:12:43 crc kubenswrapper[4775]: I0321 05:12:43.974976 4775 scope.go:117] "RemoveContainer" containerID="af6715536a4cd1315ee5f22687cb60f190dfcee964ee03a3fb9b9b38cd6a20b3" Mar 21 05:12:44 crc kubenswrapper[4775]: I0321 05:12:44.007495 4775 scope.go:117] "RemoveContainer" containerID="8177bcef8f42975589ec3469527727473adafaa3cc6210ffa908111b0912b0d5" Mar 21 05:12:44 crc kubenswrapper[4775]: I0321 05:12:44.049430 4775 scope.go:117] "RemoveContainer" containerID="2acf177da0f2ed7ac0606a427af701da980e32ea57fc8b3636ddd64f2a8a5536" Mar 21 05:13:02 crc kubenswrapper[4775]: I0321 05:13:02.482324 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:02 crc kubenswrapper[4775]: I0321 05:13:02.482914 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:15 crc 
kubenswrapper[4775]: I0321 05:13:15.962740 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:15 crc kubenswrapper[4775]: E0321 05:13:15.964043 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffbca22-6533-4ade-8f1f-eeeda82f159f" containerName="oc" Mar 21 05:13:15 crc kubenswrapper[4775]: I0321 05:13:15.964066 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffbca22-6533-4ade-8f1f-eeeda82f159f" containerName="oc" Mar 21 05:13:15 crc kubenswrapper[4775]: I0321 05:13:15.964416 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffbca22-6533-4ade-8f1f-eeeda82f159f" containerName="oc" Mar 21 05:13:15 crc kubenswrapper[4775]: I0321 05:13:15.966973 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:15 crc kubenswrapper[4775]: I0321 05:13:15.974454 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.143513 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.143588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglxd\" (UniqueName: \"kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.143634 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.245685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.245752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglxd\" (UniqueName: \"kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.245789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.246409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.246492 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.271323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglxd\" (UniqueName: \"kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd\") pod \"certified-operators-6kpp4\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.286765 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:16 crc kubenswrapper[4775]: I0321 05:13:16.918906 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:17 crc kubenswrapper[4775]: I0321 05:13:17.590686 4775 generic.go:334] "Generic (PLEG): container finished" podID="45493002-fd0c-4afd-b556-846d8cac5002" containerID="cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830" exitCode=0 Mar 21 05:13:17 crc kubenswrapper[4775]: I0321 05:13:17.590987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerDied","Data":"cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830"} Mar 21 05:13:17 crc kubenswrapper[4775]: I0321 05:13:17.591013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerStarted","Data":"f4eddf91f9b502f94f8ea62789271ca7b116cfd53376cf9077858bb64d6d2b97"} Mar 21 05:13:19 crc 
kubenswrapper[4775]: I0321 05:13:19.615352 4775 generic.go:334] "Generic (PLEG): container finished" podID="45493002-fd0c-4afd-b556-846d8cac5002" containerID="a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4" exitCode=0 Mar 21 05:13:19 crc kubenswrapper[4775]: I0321 05:13:19.615416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerDied","Data":"a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4"} Mar 21 05:13:21 crc kubenswrapper[4775]: I0321 05:13:21.636935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerStarted","Data":"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8"} Mar 21 05:13:21 crc kubenswrapper[4775]: I0321 05:13:21.660354 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6kpp4" podStartSLOduration=3.870984982 podStartE2EDuration="6.660332906s" podCreationTimestamp="2026-03-21 05:13:15 +0000 UTC" firstStartedPulling="2026-03-21 05:13:17.592171854 +0000 UTC m=+1550.568635478" lastFinishedPulling="2026-03-21 05:13:20.381519778 +0000 UTC m=+1553.357983402" observedRunningTime="2026-03-21 05:13:21.656826248 +0000 UTC m=+1554.633289882" watchObservedRunningTime="2026-03-21 05:13:21.660332906 +0000 UTC m=+1554.636796540" Mar 21 05:13:26 crc kubenswrapper[4775]: I0321 05:13:26.287605 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:26 crc kubenswrapper[4775]: I0321 05:13:26.288081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:26 crc kubenswrapper[4775]: I0321 05:13:26.336681 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:26 crc kubenswrapper[4775]: I0321 05:13:26.720473 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:26 crc kubenswrapper[4775]: I0321 05:13:26.769105 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:28 crc kubenswrapper[4775]: I0321 05:13:28.692930 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6kpp4" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="registry-server" containerID="cri-o://c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8" gracePeriod=2 Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.182461 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.321692 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content\") pod \"45493002-fd0c-4afd-b556-846d8cac5002\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.321949 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rglxd\" (UniqueName: \"kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd\") pod \"45493002-fd0c-4afd-b556-846d8cac5002\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.322107 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities\") pod 
\"45493002-fd0c-4afd-b556-846d8cac5002\" (UID: \"45493002-fd0c-4afd-b556-846d8cac5002\") " Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.322806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities" (OuterVolumeSpecName: "utilities") pod "45493002-fd0c-4afd-b556-846d8cac5002" (UID: "45493002-fd0c-4afd-b556-846d8cac5002"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.327949 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd" (OuterVolumeSpecName: "kube-api-access-rglxd") pod "45493002-fd0c-4afd-b556-846d8cac5002" (UID: "45493002-fd0c-4afd-b556-846d8cac5002"). InnerVolumeSpecName "kube-api-access-rglxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.377751 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45493002-fd0c-4afd-b556-846d8cac5002" (UID: "45493002-fd0c-4afd-b556-846d8cac5002"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.424935 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.424999 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rglxd\" (UniqueName: \"kubernetes.io/projected/45493002-fd0c-4afd-b556-846d8cac5002-kube-api-access-rglxd\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.425070 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45493002-fd0c-4afd-b556-846d8cac5002-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.703145 4775 generic.go:334] "Generic (PLEG): container finished" podID="45493002-fd0c-4afd-b556-846d8cac5002" containerID="c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8" exitCode=0 Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.703200 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6kpp4" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.703199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerDied","Data":"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8"} Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.703339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6kpp4" event={"ID":"45493002-fd0c-4afd-b556-846d8cac5002","Type":"ContainerDied","Data":"f4eddf91f9b502f94f8ea62789271ca7b116cfd53376cf9077858bb64d6d2b97"} Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.703361 4775 scope.go:117] "RemoveContainer" containerID="c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.725864 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.729430 4775 scope.go:117] "RemoveContainer" containerID="a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.735629 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6kpp4"] Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.762426 4775 scope.go:117] "RemoveContainer" containerID="cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.813445 4775 scope.go:117] "RemoveContainer" containerID="c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8" Mar 21 05:13:29 crc kubenswrapper[4775]: E0321 05:13:29.814036 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8\": container with ID starting with c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8 not found: ID does not exist" containerID="c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.814067 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8"} err="failed to get container status \"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8\": rpc error: code = NotFound desc = could not find container \"c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8\": container with ID starting with c4d82c4bdc19edc6f1e2127fa8d601651b0581f9c288d53c6563503a2bf55be8 not found: ID does not exist" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.814088 4775 scope.go:117] "RemoveContainer" containerID="a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4" Mar 21 05:13:29 crc kubenswrapper[4775]: E0321 05:13:29.814474 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4\": container with ID starting with a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4 not found: ID does not exist" containerID="a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.814529 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4"} err="failed to get container status \"a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4\": rpc error: code = NotFound desc = could not find container \"a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4\": container with ID 
starting with a6782ced417e87e9facb555ea87dc070f4238ca0a19d2f8a047d54825d30f5c4 not found: ID does not exist" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.814562 4775 scope.go:117] "RemoveContainer" containerID="cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830" Mar 21 05:13:29 crc kubenswrapper[4775]: E0321 05:13:29.815182 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830\": container with ID starting with cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830 not found: ID does not exist" containerID="cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830" Mar 21 05:13:29 crc kubenswrapper[4775]: I0321 05:13:29.815249 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830"} err="failed to get container status \"cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830\": rpc error: code = NotFound desc = could not find container \"cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830\": container with ID starting with cdca173b9c8744ded95289b25e21f20b4cd227c1dd0864ebe8f6347181032830 not found: ID does not exist" Mar 21 05:13:31 crc kubenswrapper[4775]: I0321 05:13:31.678303 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45493002-fd0c-4afd-b556-846d8cac5002" path="/var/lib/kubelet/pods/45493002-fd0c-4afd-b556-846d8cac5002/volumes" Mar 21 05:13:32 crc kubenswrapper[4775]: I0321 05:13:32.482770 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:32 crc kubenswrapper[4775]: I0321 
05:13:32.482835 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:44 crc kubenswrapper[4775]: I0321 05:13:44.187886 4775 scope.go:117] "RemoveContainer" containerID="fdcd202238496f57430b4ad2beaac05c4b222263df155d2d3bab8e200bbd88c5" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.146440 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567834-bqpk6"] Mar 21 05:14:00 crc kubenswrapper[4775]: E0321 05:14:00.147398 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.147410 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4775]: E0321 05:14:00.147432 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.147440 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4775]: E0321 05:14:00.147451 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.147458 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.147637 4775 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="45493002-fd0c-4afd-b556-846d8cac5002" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.148247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.150566 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.151188 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.151343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.157652 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-bqpk6"] Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.332086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6c2j\" (UniqueName: \"kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j\") pod \"auto-csr-approver-29567834-bqpk6\" (UID: \"dca04e19-ebb7-47c3-a588-e3dc932857bf\") " pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.433653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6c2j\" (UniqueName: \"kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j\") pod \"auto-csr-approver-29567834-bqpk6\" (UID: \"dca04e19-ebb7-47c3-a588-e3dc932857bf\") " pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.451442 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d6c2j\" (UniqueName: \"kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j\") pod \"auto-csr-approver-29567834-bqpk6\" (UID: \"dca04e19-ebb7-47c3-a588-e3dc932857bf\") " pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.466446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.830136 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-bqpk6"] Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.852140 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:14:00 crc kubenswrapper[4775]: I0321 05:14:00.987904 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" event={"ID":"dca04e19-ebb7-47c3-a588-e3dc932857bf","Type":"ContainerStarted","Data":"8ecd5485d0b1d6515fca91ee9e66c8a4fe237e47bea885e8f4c2f9da20eac9d3"} Mar 21 05:14:02 crc kubenswrapper[4775]: I0321 05:14:02.483744 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:14:02 crc kubenswrapper[4775]: I0321 05:14:02.484812 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:14:02 crc kubenswrapper[4775]: I0321 05:14:02.485168 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:14:02 crc kubenswrapper[4775]: I0321 05:14:02.486360 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:14:02 crc kubenswrapper[4775]: I0321 05:14:02.486435 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" gracePeriod=600 Mar 21 05:14:02 crc kubenswrapper[4775]: E0321 05:14:02.633532 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:03 crc kubenswrapper[4775]: I0321 05:14:03.010370 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" exitCode=0 Mar 21 05:14:03 crc kubenswrapper[4775]: I0321 05:14:03.010433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"} 
Mar 21 05:14:03 crc kubenswrapper[4775]: I0321 05:14:03.010471 4775 scope.go:117] "RemoveContainer" containerID="b6e85a7c4acc97394b06df812a396284f471ec9c7f9eee22918e9da54e21feda" Mar 21 05:14:03 crc kubenswrapper[4775]: I0321 05:14:03.011540 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:14:03 crc kubenswrapper[4775]: E0321 05:14:03.011974 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:04 crc kubenswrapper[4775]: I0321 05:14:04.022733 4775 generic.go:334] "Generic (PLEG): container finished" podID="dca04e19-ebb7-47c3-a588-e3dc932857bf" containerID="a1c552cc0e8c345fb5a90c312288f579e557c98ae97907195961586729f792ec" exitCode=0 Mar 21 05:14:04 crc kubenswrapper[4775]: I0321 05:14:04.022788 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" event={"ID":"dca04e19-ebb7-47c3-a588-e3dc932857bf","Type":"ContainerDied","Data":"a1c552cc0e8c345fb5a90c312288f579e557c98ae97907195961586729f792ec"} Mar 21 05:14:05 crc kubenswrapper[4775]: I0321 05:14:05.370087 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:05 crc kubenswrapper[4775]: I0321 05:14:05.443001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6c2j\" (UniqueName: \"kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j\") pod \"dca04e19-ebb7-47c3-a588-e3dc932857bf\" (UID: \"dca04e19-ebb7-47c3-a588-e3dc932857bf\") " Mar 21 05:14:05 crc kubenswrapper[4775]: I0321 05:14:05.448818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j" (OuterVolumeSpecName: "kube-api-access-d6c2j") pod "dca04e19-ebb7-47c3-a588-e3dc932857bf" (UID: "dca04e19-ebb7-47c3-a588-e3dc932857bf"). InnerVolumeSpecName "kube-api-access-d6c2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:05 crc kubenswrapper[4775]: I0321 05:14:05.545755 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6c2j\" (UniqueName: \"kubernetes.io/projected/dca04e19-ebb7-47c3-a588-e3dc932857bf-kube-api-access-d6c2j\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:06 crc kubenswrapper[4775]: I0321 05:14:06.047779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" event={"ID":"dca04e19-ebb7-47c3-a588-e3dc932857bf","Type":"ContainerDied","Data":"8ecd5485d0b1d6515fca91ee9e66c8a4fe237e47bea885e8f4c2f9da20eac9d3"} Mar 21 05:14:06 crc kubenswrapper[4775]: I0321 05:14:06.047823 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ecd5485d0b1d6515fca91ee9e66c8a4fe237e47bea885e8f4c2f9da20eac9d3" Mar 21 05:14:06 crc kubenswrapper[4775]: I0321 05:14:06.047837 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-bqpk6" Mar 21 05:14:06 crc kubenswrapper[4775]: I0321 05:14:06.437314 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-6klnl"] Mar 21 05:14:06 crc kubenswrapper[4775]: I0321 05:14:06.446452 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-6klnl"] Mar 21 05:14:07 crc kubenswrapper[4775]: I0321 05:14:07.670781 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eccf102b-f91a-4251-b878-1c21eea92522" path="/var/lib/kubelet/pods/eccf102b-f91a-4251-b878-1c21eea92522/volumes" Mar 21 05:14:17 crc kubenswrapper[4775]: I0321 05:14:17.670766 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:14:17 crc kubenswrapper[4775]: E0321 05:14:17.671958 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:31 crc kubenswrapper[4775]: I0321 05:14:31.661595 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:14:31 crc kubenswrapper[4775]: E0321 05:14:31.662515 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:44 crc kubenswrapper[4775]: I0321 05:14:44.279058 4775 scope.go:117] "RemoveContainer" containerID="bca1df983777352aa51a410cf20699daccf33fb2338a4057481b7410153afd88" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.392087 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:45 crc kubenswrapper[4775]: E0321 05:14:45.392579 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca04e19-ebb7-47c3-a588-e3dc932857bf" containerName="oc" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.392594 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca04e19-ebb7-47c3-a588-e3dc932857bf" containerName="oc" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.392853 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca04e19-ebb7-47c3-a588-e3dc932857bf" containerName="oc" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.394499 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.419360 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.478275 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.478393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.478416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlgt\" (UniqueName: \"kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.579937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.580320 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.580352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlgt\" (UniqueName: \"kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.580524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.580753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.605976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlgt\" (UniqueName: \"kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt\") pod \"community-operators-pjdkb\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.663820 4775 scope.go:117] "RemoveContainer" 
containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:14:45 crc kubenswrapper[4775]: E0321 05:14:45.664028 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:45 crc kubenswrapper[4775]: I0321 05:14:45.728703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:46 crc kubenswrapper[4775]: I0321 05:14:46.251873 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:46 crc kubenswrapper[4775]: I0321 05:14:46.447734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerStarted","Data":"7a4739de3c9c5b6e3a614786f2508babd568fd35bee8f59c0d8d1429043ef547"} Mar 21 05:14:47 crc kubenswrapper[4775]: I0321 05:14:47.459243 4775 generic.go:334] "Generic (PLEG): container finished" podID="20a64964-7adb-48e9-93f6-979650e14968" containerID="b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d" exitCode=0 Mar 21 05:14:47 crc kubenswrapper[4775]: I0321 05:14:47.459338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerDied","Data":"b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d"} Mar 21 05:14:48 crc kubenswrapper[4775]: I0321 05:14:48.470559 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="20a64964-7adb-48e9-93f6-979650e14968" containerID="54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4" exitCode=0 Mar 21 05:14:48 crc kubenswrapper[4775]: I0321 05:14:48.470661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerDied","Data":"54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4"} Mar 21 05:14:49 crc kubenswrapper[4775]: I0321 05:14:49.481108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerStarted","Data":"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d"} Mar 21 05:14:49 crc kubenswrapper[4775]: I0321 05:14:49.497207 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pjdkb" podStartSLOduration=2.986797348 podStartE2EDuration="4.497185421s" podCreationTimestamp="2026-03-21 05:14:45 +0000 UTC" firstStartedPulling="2026-03-21 05:14:47.461700639 +0000 UTC m=+1640.438164263" lastFinishedPulling="2026-03-21 05:14:48.972088712 +0000 UTC m=+1641.948552336" observedRunningTime="2026-03-21 05:14:49.495999407 +0000 UTC m=+1642.472463041" watchObservedRunningTime="2026-03-21 05:14:49.497185421 +0000 UTC m=+1642.473649045" Mar 21 05:14:55 crc kubenswrapper[4775]: I0321 05:14:55.729802 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:55 crc kubenswrapper[4775]: I0321 05:14:55.730208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:55 crc kubenswrapper[4775]: I0321 05:14:55.776575 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 
05:14:56 crc kubenswrapper[4775]: I0321 05:14:56.581558 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:56 crc kubenswrapper[4775]: I0321 05:14:56.649803 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:56 crc kubenswrapper[4775]: I0321 05:14:56.661564 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:14:56 crc kubenswrapper[4775]: E0321 05:14:56.661828 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:14:58 crc kubenswrapper[4775]: I0321 05:14:58.558987 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pjdkb" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="registry-server" containerID="cri-o://a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d" gracePeriod=2 Mar 21 05:14:58 crc kubenswrapper[4775]: I0321 05:14:58.958802 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.152336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content\") pod \"20a64964-7adb-48e9-93f6-979650e14968\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.152518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities\") pod \"20a64964-7adb-48e9-93f6-979650e14968\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.152600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmlgt\" (UniqueName: \"kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt\") pod \"20a64964-7adb-48e9-93f6-979650e14968\" (UID: \"20a64964-7adb-48e9-93f6-979650e14968\") " Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.153205 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities" (OuterVolumeSpecName: "utilities") pod "20a64964-7adb-48e9-93f6-979650e14968" (UID: "20a64964-7adb-48e9-93f6-979650e14968"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.164021 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt" (OuterVolumeSpecName: "kube-api-access-tmlgt") pod "20a64964-7adb-48e9-93f6-979650e14968" (UID: "20a64964-7adb-48e9-93f6-979650e14968"). InnerVolumeSpecName "kube-api-access-tmlgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.209782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20a64964-7adb-48e9-93f6-979650e14968" (UID: "20a64964-7adb-48e9-93f6-979650e14968"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.255467 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.255502 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a64964-7adb-48e9-93f6-979650e14968-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.255512 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmlgt\" (UniqueName: \"kubernetes.io/projected/20a64964-7adb-48e9-93f6-979650e14968-kube-api-access-tmlgt\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.573938 4775 generic.go:334] "Generic (PLEG): container finished" podID="20a64964-7adb-48e9-93f6-979650e14968" containerID="a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d" exitCode=0 Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.573991 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerDied","Data":"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d"} Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.574027 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-pjdkb" event={"ID":"20a64964-7adb-48e9-93f6-979650e14968","Type":"ContainerDied","Data":"7a4739de3c9c5b6e3a614786f2508babd568fd35bee8f59c0d8d1429043ef547"} Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.574046 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pjdkb" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.574054 4775 scope.go:117] "RemoveContainer" containerID="a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.617695 4775 scope.go:117] "RemoveContainer" containerID="54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.620094 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.629522 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pjdkb"] Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.649799 4775 scope.go:117] "RemoveContainer" containerID="b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.674071 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a64964-7adb-48e9-93f6-979650e14968" path="/var/lib/kubelet/pods/20a64964-7adb-48e9-93f6-979650e14968/volumes" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.693379 4775 scope.go:117] "RemoveContainer" containerID="a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d" Mar 21 05:14:59 crc kubenswrapper[4775]: E0321 05:14:59.693816 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d\": container with ID 
starting with a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d not found: ID does not exist" containerID="a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.693871 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d"} err="failed to get container status \"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d\": rpc error: code = NotFound desc = could not find container \"a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d\": container with ID starting with a566da79886c1bc3d992c8c0d8bbadee250657cb273d16a611341ef58c4a8b7d not found: ID does not exist" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.693920 4775 scope.go:117] "RemoveContainer" containerID="54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4" Mar 21 05:14:59 crc kubenswrapper[4775]: E0321 05:14:59.694341 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4\": container with ID starting with 54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4 not found: ID does not exist" containerID="54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.694370 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4"} err="failed to get container status \"54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4\": rpc error: code = NotFound desc = could not find container \"54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4\": container with ID starting with 54f2e589f96425fcbb5095c069f909848ce69dc1edfa225e2b3eaa7471d513e4 not found: 
ID does not exist" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.694392 4775 scope.go:117] "RemoveContainer" containerID="b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d" Mar 21 05:14:59 crc kubenswrapper[4775]: E0321 05:14:59.694805 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d\": container with ID starting with b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d not found: ID does not exist" containerID="b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d" Mar 21 05:14:59 crc kubenswrapper[4775]: I0321 05:14:59.694835 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d"} err="failed to get container status \"b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d\": rpc error: code = NotFound desc = could not find container \"b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d\": container with ID starting with b7af7cafa1c2a037dbe9f05f5b2616050b352fe6dd4b77bccbb65fd568e4855d not found: ID does not exist" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.151829 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx"] Mar 21 05:15:00 crc kubenswrapper[4775]: E0321 05:15:00.152457 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="registry-server" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.152486 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="registry-server" Mar 21 05:15:00 crc kubenswrapper[4775]: E0321 05:15:00.152512 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="extract-content" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.152520 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="extract-content" Mar 21 05:15:00 crc kubenswrapper[4775]: E0321 05:15:00.152535 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="extract-utilities" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.152543 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="extract-utilities" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.152987 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a64964-7adb-48e9-93f6-979650e14968" containerName="registry-server" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.154658 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.160771 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx"] Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.161655 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.161954 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.275719 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume\") pod \"collect-profiles-29567835-47dnx\" 
(UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.276147 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.276252 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb749\" (UniqueName: \"kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.378384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.378514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb749\" (UniqueName: \"kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.378570 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.379380 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.384927 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.399110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb749\" (UniqueName: \"kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749\") pod \"collect-profiles-29567835-47dnx\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.489932 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:00 crc kubenswrapper[4775]: I0321 05:15:00.949798 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx"] Mar 21 05:15:01 crc kubenswrapper[4775]: I0321 05:15:01.623675 4775 generic.go:334] "Generic (PLEG): container finished" podID="8bebab26-7b09-4e00-adee-fffa3c06f5ab" containerID="c3748d6274f12563ae1762dcd4c9c668e2bf7b8aecb55d574f50d517af661d16" exitCode=0 Mar 21 05:15:01 crc kubenswrapper[4775]: I0321 05:15:01.623755 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" event={"ID":"8bebab26-7b09-4e00-adee-fffa3c06f5ab","Type":"ContainerDied","Data":"c3748d6274f12563ae1762dcd4c9c668e2bf7b8aecb55d574f50d517af661d16"} Mar 21 05:15:01 crc kubenswrapper[4775]: I0321 05:15:01.623946 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" event={"ID":"8bebab26-7b09-4e00-adee-fffa3c06f5ab","Type":"ContainerStarted","Data":"4e852416a815e8ad2197d4a13c6cdcc4844025c71374e05497715f2c9a4e7420"} Mar 21 05:15:01 crc kubenswrapper[4775]: I0321 05:15:01.626818 4775 generic.go:334] "Generic (PLEG): container finished" podID="203df932-0574-4098-b897-ba50813f2ec1" containerID="e83db7ccdc230f6a7c372f2ab3e28ef862efd0646a039a1d8f53714e19540048" exitCode=0 Mar 21 05:15:01 crc kubenswrapper[4775]: I0321 05:15:01.626870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" event={"ID":"203df932-0574-4098-b897-ba50813f2ec1","Type":"ContainerDied","Data":"e83db7ccdc230f6a7c372f2ab3e28ef862efd0646a039a1d8f53714e19540048"} Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.051798 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.059069 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138096 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfxh\" (UniqueName: \"kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh\") pod \"203df932-0574-4098-b897-ba50813f2ec1\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138216 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb749\" (UniqueName: \"kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749\") pod \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume\") pod \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume\") pod \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\" (UID: \"8bebab26-7b09-4e00-adee-fffa3c06f5ab\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory\") pod 
\"203df932-0574-4098-b897-ba50813f2ec1\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138739 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle\") pod \"203df932-0574-4098-b897-ba50813f2ec1\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.138778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam\") pod \"203df932-0574-4098-b897-ba50813f2ec1\" (UID: \"203df932-0574-4098-b897-ba50813f2ec1\") " Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.139605 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bebab26-7b09-4e00-adee-fffa3c06f5ab" (UID: "8bebab26-7b09-4e00-adee-fffa3c06f5ab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.144716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bebab26-7b09-4e00-adee-fffa3c06f5ab" (UID: "8bebab26-7b09-4e00-adee-fffa3c06f5ab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.145004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749" (OuterVolumeSpecName: "kube-api-access-kb749") pod "8bebab26-7b09-4e00-adee-fffa3c06f5ab" (UID: "8bebab26-7b09-4e00-adee-fffa3c06f5ab"). InnerVolumeSpecName "kube-api-access-kb749". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.145058 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh" (OuterVolumeSpecName: "kube-api-access-5sfxh") pod "203df932-0574-4098-b897-ba50813f2ec1" (UID: "203df932-0574-4098-b897-ba50813f2ec1"). InnerVolumeSpecName "kube-api-access-5sfxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.145509 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "203df932-0574-4098-b897-ba50813f2ec1" (UID: "203df932-0574-4098-b897-ba50813f2ec1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.168835 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory" (OuterVolumeSpecName: "inventory") pod "203df932-0574-4098-b897-ba50813f2ec1" (UID: "203df932-0574-4098-b897-ba50813f2ec1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.182094 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "203df932-0574-4098-b897-ba50813f2ec1" (UID: "203df932-0574-4098-b897-ba50813f2ec1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240607 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240646 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240657 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/203df932-0574-4098-b897-ba50813f2ec1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240667 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfxh\" (UniqueName: \"kubernetes.io/projected/203df932-0574-4098-b897-ba50813f2ec1-kube-api-access-5sfxh\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240676 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb749\" (UniqueName: \"kubernetes.io/projected/8bebab26-7b09-4e00-adee-fffa3c06f5ab-kube-api-access-kb749\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240684 4775 
reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bebab26-7b09-4e00-adee-fffa3c06f5ab-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.240693 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bebab26-7b09-4e00-adee-fffa3c06f5ab-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.648650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" event={"ID":"8bebab26-7b09-4e00-adee-fffa3c06f5ab","Type":"ContainerDied","Data":"4e852416a815e8ad2197d4a13c6cdcc4844025c71374e05497715f2c9a4e7420"} Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.648982 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e852416a815e8ad2197d4a13c6cdcc4844025c71374e05497715f2c9a4e7420" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.648672 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.650600 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" event={"ID":"203df932-0574-4098-b897-ba50813f2ec1","Type":"ContainerDied","Data":"3c801bc6a121fd4faebee72f398b0bcdeabcebcace5ab4c44442f937b77950dc"} Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.650639 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c801bc6a121fd4faebee72f398b0bcdeabcebcace5ab4c44442f937b77950dc" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.650680 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.753338 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr"] Mar 21 05:15:03 crc kubenswrapper[4775]: E0321 05:15:03.753832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203df932-0574-4098-b897-ba50813f2ec1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.753857 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="203df932-0574-4098-b897-ba50813f2ec1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:15:03 crc kubenswrapper[4775]: E0321 05:15:03.753900 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bebab26-7b09-4e00-adee-fffa3c06f5ab" containerName="collect-profiles" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.753914 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bebab26-7b09-4e00-adee-fffa3c06f5ab" containerName="collect-profiles" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.754155 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bebab26-7b09-4e00-adee-fffa3c06f5ab" containerName="collect-profiles" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.754190 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="203df932-0574-4098-b897-ba50813f2ec1" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.755010 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.757516 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.757815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.758439 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.758699 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.771867 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr"] Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.850663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.850794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 
05:15:03.850891 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8td\" (UniqueName: \"kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.956110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.956397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.956562 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps8td\" (UniqueName: \"kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.962553 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.966768 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:03 crc kubenswrapper[4775]: I0321 05:15:03.984083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8td\" (UniqueName: \"kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-h87nr\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:04 crc kubenswrapper[4775]: I0321 05:15:04.089432 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:15:04 crc kubenswrapper[4775]: I0321 05:15:04.630552 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr"] Mar 21 05:15:04 crc kubenswrapper[4775]: I0321 05:15:04.659616 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" event={"ID":"28040d61-c9ea-4a55-b113-db871dff679c","Type":"ContainerStarted","Data":"ef0fcdf856bb63344889100ab23ff6e3c752ecddd8029d7da572bb02105b88a6"} Mar 21 05:15:06 crc kubenswrapper[4775]: I0321 05:15:06.680889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" event={"ID":"28040d61-c9ea-4a55-b113-db871dff679c","Type":"ContainerStarted","Data":"4f6b82cb9ea1aa1f200cad1ffb7a707dc8d8464ef48ddbc1ece993160bc193ff"} Mar 21 05:15:06 crc kubenswrapper[4775]: I0321 05:15:06.702518 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" podStartSLOduration=2.357774997 podStartE2EDuration="3.702497304s" podCreationTimestamp="2026-03-21 05:15:03 +0000 UTC" firstStartedPulling="2026-03-21 05:15:04.632323881 +0000 UTC m=+1657.608787505" lastFinishedPulling="2026-03-21 05:15:05.977046178 +0000 UTC m=+1658.953509812" observedRunningTime="2026-03-21 05:15:06.692642308 +0000 UTC m=+1659.669105932" watchObservedRunningTime="2026-03-21 05:15:06.702497304 +0000 UTC m=+1659.678960928" Mar 21 05:15:10 crc kubenswrapper[4775]: I0321 05:15:10.661555 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:15:10 crc kubenswrapper[4775]: E0321 05:15:10.662420 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:15:22 crc kubenswrapper[4775]: I0321 05:15:22.661423 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:15:22 crc kubenswrapper[4775]: E0321 05:15:22.662136 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:15:37 crc kubenswrapper[4775]: I0321 05:15:37.671343 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:15:37 crc kubenswrapper[4775]: E0321 05:15:37.672160 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:15:44 crc kubenswrapper[4775]: I0321 05:15:44.383779 4775 scope.go:117] "RemoveContainer" containerID="3d41a9111ff02a22981deef2e158c7aaae04c761a3f477f9cde2ecff8054c0f1" Mar 21 05:15:44 crc kubenswrapper[4775]: I0321 05:15:44.421986 4775 scope.go:117] "RemoveContainer" 
containerID="992670c46bdc3ecfcf02d509f68efcc94de86d4d365af73eee5b378735e764a6" Mar 21 05:15:44 crc kubenswrapper[4775]: I0321 05:15:44.451778 4775 scope.go:117] "RemoveContainer" containerID="a3e8af6598015c259cd8e37d94c33e2e14ca3a07bdbbfc459894647fdb9e40ea" Mar 21 05:15:44 crc kubenswrapper[4775]: I0321 05:15:44.515073 4775 scope.go:117] "RemoveContainer" containerID="210c52f72b30e6d06b1636b7f4ba7c385a206b219bca52c176d9da81b984a93a" Mar 21 05:15:48 crc kubenswrapper[4775]: I0321 05:15:48.662674 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:15:48 crc kubenswrapper[4775]: E0321 05:15:48.663565 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.168299 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567836-whhlh"] Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.170876 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.174547 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.177750 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.177854 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.188203 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-whhlh"] Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.329436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlxw\" (UniqueName: \"kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw\") pod \"auto-csr-approver-29567836-whhlh\" (UID: \"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc\") " pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.431795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlxw\" (UniqueName: \"kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw\") pod \"auto-csr-approver-29567836-whhlh\" (UID: \"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc\") " pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.455567 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlxw\" (UniqueName: \"kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw\") pod \"auto-csr-approver-29567836-whhlh\" (UID: \"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc\") " 
pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:00 crc kubenswrapper[4775]: I0321 05:16:00.497850 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:01 crc kubenswrapper[4775]: I0321 05:16:01.068896 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-whhlh"] Mar 21 05:16:01 crc kubenswrapper[4775]: I0321 05:16:01.257585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-whhlh" event={"ID":"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc","Type":"ContainerStarted","Data":"f61d14db2b5b45ca660eea6cc2d22453b0890440d252c939cb8c03e9cda1c42a"} Mar 21 05:16:01 crc kubenswrapper[4775]: I0321 05:16:01.661897 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:16:01 crc kubenswrapper[4775]: E0321 05:16:01.662343 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:16:04 crc kubenswrapper[4775]: I0321 05:16:04.291725 4775 generic.go:334] "Generic (PLEG): container finished" podID="4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" containerID="f116b6d526cd4aaa616f201bcf5084943e1e56754ce61bd36026bd8732dd3bd1" exitCode=0 Mar 21 05:16:04 crc kubenswrapper[4775]: I0321 05:16:04.291783 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-whhlh" event={"ID":"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc","Type":"ContainerDied","Data":"f116b6d526cd4aaa616f201bcf5084943e1e56754ce61bd36026bd8732dd3bd1"} 
Mar 21 05:16:05 crc kubenswrapper[4775]: I0321 05:16:05.830825 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:05 crc kubenswrapper[4775]: I0321 05:16:05.904834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvlxw\" (UniqueName: \"kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw\") pod \"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc\" (UID: \"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc\") " Mar 21 05:16:05 crc kubenswrapper[4775]: I0321 05:16:05.919211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw" (OuterVolumeSpecName: "kube-api-access-hvlxw") pod "4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" (UID: "4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc"). InnerVolumeSpecName "kube-api-access-hvlxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.009077 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvlxw\" (UniqueName: \"kubernetes.io/projected/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc-kube-api-access-hvlxw\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.314790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-whhlh" event={"ID":"4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc","Type":"ContainerDied","Data":"f61d14db2b5b45ca660eea6cc2d22453b0890440d252c939cb8c03e9cda1c42a"} Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.315092 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61d14db2b5b45ca660eea6cc2d22453b0890440d252c939cb8c03e9cda1c42a" Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.314952 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-whhlh" Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.921417 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-jv78x"] Mar 21 05:16:06 crc kubenswrapper[4775]: I0321 05:16:06.933669 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-jv78x"] Mar 21 05:16:07 crc kubenswrapper[4775]: I0321 05:16:07.678946 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33cb4f91-aca0-4221-873f-e2c16e30ccee" path="/var/lib/kubelet/pods/33cb4f91-aca0-4221-873f-e2c16e30ccee/volumes" Mar 21 05:16:11 crc kubenswrapper[4775]: I0321 05:16:11.042545 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fmfms"] Mar 21 05:16:11 crc kubenswrapper[4775]: I0321 05:16:11.051574 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fmfms"] Mar 21 05:16:11 crc kubenswrapper[4775]: I0321 05:16:11.675503 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a151b65-5187-48d2-a29a-30eae42c179c" path="/var/lib/kubelet/pods/3a151b65-5187-48d2-a29a-30eae42c179c/volumes" Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.054094 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6de1-account-create-update-bkzxl"] Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.067408 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-594f-account-create-update-9ntmw"] Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.078999 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6de1-account-create-update-bkzxl"] Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.090804 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-429fj"] Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.102598 4775 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-594f-account-create-update-9ntmw"] Mar 21 05:16:12 crc kubenswrapper[4775]: I0321 05:16:12.113661 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-429fj"] Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.042535 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-41dd-account-create-update-5ltn6"] Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.058039 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-j56nc"] Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.071847 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-j56nc"] Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.083326 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-41dd-account-create-update-5ltn6"] Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.675912 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6" path="/var/lib/kubelet/pods/1623bfa5-c196-4b3f-b1ad-2e895cb6e6d6/volumes" Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.676684 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e15fc9-dccb-44f6-9266-2bfe23d9e224" path="/var/lib/kubelet/pods/17e15fc9-dccb-44f6-9266-2bfe23d9e224/volumes" Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.677746 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88db3052-d0c3-4f00-a116-aea162a7790b" path="/var/lib/kubelet/pods/88db3052-d0c3-4f00-a116-aea162a7790b/volumes" Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 05:16:13.678513 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ecd8b9-b671-4181-96e9-adf3e820c8c7" path="/var/lib/kubelet/pods/a5ecd8b9-b671-4181-96e9-adf3e820c8c7/volumes" Mar 21 05:16:13 crc kubenswrapper[4775]: I0321 
05:16:13.679308 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c83c73-8982-4c8e-bdeb-204a4874162c" path="/var/lib/kubelet/pods/a8c83c73-8982-4c8e-bdeb-204a4874162c/volumes" Mar 21 05:16:14 crc kubenswrapper[4775]: I0321 05:16:14.662832 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:16:14 crc kubenswrapper[4775]: E0321 05:16:14.663697 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:16:25 crc kubenswrapper[4775]: I0321 05:16:25.663557 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:16:25 crc kubenswrapper[4775]: E0321 05:16:25.664839 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:16:32 crc kubenswrapper[4775]: I0321 05:16:32.045914 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-drcfj"] Mar 21 05:16:32 crc kubenswrapper[4775]: I0321 05:16:32.059756 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-drcfj"] Mar 21 05:16:33 crc kubenswrapper[4775]: I0321 05:16:33.676173 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b4c6072b-3aa5-43ef-be24-f9b20e5095bd" path="/var/lib/kubelet/pods/b4c6072b-3aa5-43ef-be24-f9b20e5095bd/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.040299 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-x5t7w"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.052922 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b34c-account-create-update-6kvs4"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.067300 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b34c-account-create-update-6kvs4"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.079680 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-x5t7w"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.091140 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6ca3-account-create-update-p8m7j"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.100850 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7a73-account-create-update-jvhc8"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.110391 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rbcrx"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.122625 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7a73-account-create-update-jvhc8"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.132381 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rbcrx"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.142307 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6ca3-account-create-update-p8m7j"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.151426 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9vtvz"] Mar 21 
05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.160937 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9vtvz"] Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.676643 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32831ef2-2e09-453b-9bea-dffe7423fa37" path="/var/lib/kubelet/pods/32831ef2-2e09-453b-9bea-dffe7423fa37/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.677432 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74dea94d-fe8b-49e3-9730-fbb641cafaab" path="/var/lib/kubelet/pods/74dea94d-fe8b-49e3-9730-fbb641cafaab/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.678145 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762f4dd9-9a96-4cdb-aa87-e181f5959140" path="/var/lib/kubelet/pods/762f4dd9-9a96-4cdb-aa87-e181f5959140/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.678768 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6" path="/var/lib/kubelet/pods/94ac9553-1ce0-4c22-9b2a-e7f38b75a6d6/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.679939 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b" path="/var/lib/kubelet/pods/9e291ad2-f2c9-4fd6-8ac8-f1e8e53e6a6b/volumes" Mar 21 05:16:37 crc kubenswrapper[4775]: I0321 05:16:37.680636 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3e27f70-6ea1-4f2e-a42e-e65c8ba76147" path="/var/lib/kubelet/pods/d3e27f70-6ea1-4f2e-a42e-e65c8ba76147/volumes" Mar 21 05:16:39 crc kubenswrapper[4775]: I0321 05:16:39.662386 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:16:39 crc kubenswrapper[4775]: E0321 05:16:39.663644 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.612728 4775 scope.go:117] "RemoveContainer" containerID="5d0abdc2d7bcbfcb69006c6e61f11899c9c6ba0f99f80bd6ef0c79dc4299f856" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.649255 4775 scope.go:117] "RemoveContainer" containerID="f3bbbe70d0549a7acec5479ed8da7aa8abe15ef1981c7636975b0a1dd52d35cc" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.701944 4775 scope.go:117] "RemoveContainer" containerID="3d3602be9c806dde89f844d57f5ed730a681b0e2d28bcdfcf3fd72611d393bf0" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.743936 4775 scope.go:117] "RemoveContainer" containerID="05fed25f10a4568226dd529e50e8f49b33dd0e98b0199020ea806a0cb77f54ee" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.792084 4775 scope.go:117] "RemoveContainer" containerID="612cd0e0df303a73de827afa0ae711fb6535100e6b3d945953567e8f21be3662" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.801541 4775 generic.go:334] "Generic (PLEG): container finished" podID="28040d61-c9ea-4a55-b113-db871dff679c" containerID="4f6b82cb9ea1aa1f200cad1ffb7a707dc8d8464ef48ddbc1ece993160bc193ff" exitCode=0 Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.801637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" event={"ID":"28040d61-c9ea-4a55-b113-db871dff679c","Type":"ContainerDied","Data":"4f6b82cb9ea1aa1f200cad1ffb7a707dc8d8464ef48ddbc1ece993160bc193ff"} Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.870085 4775 scope.go:117] "RemoveContainer" 
containerID="db02c0eaade2846804c5f5491c8f4869f16bd0a0624e9878aa43eba4fd63b5ed" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.912744 4775 scope.go:117] "RemoveContainer" containerID="96a2bef185388797d3fea57d8ac4fefdb91c53837dbce4eb9abcae899aab9b04" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.956048 4775 scope.go:117] "RemoveContainer" containerID="7ca2f730d3508ebef97a36404de20e26217deca533b74b90b3415ce6b5463328" Mar 21 05:16:44 crc kubenswrapper[4775]: I0321 05:16:44.990047 4775 scope.go:117] "RemoveContainer" containerID="e4255198ffff3f324a60a06814346bd92cba238dd6b295fce65cd1387f10c2d6" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.025873 4775 scope.go:117] "RemoveContainer" containerID="2b7a69f7824c3af4bbbbe389045d25783f39861d479383b702bdac257cb7b2e3" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.049690 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dl5n2"] Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.061396 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dl5n2"] Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.064294 4775 scope.go:117] "RemoveContainer" containerID="9076af8667b17ded58ed73ceb5a5784857256d7d90cd1bdaecae68562f5cf370" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.091197 4775 scope.go:117] "RemoveContainer" containerID="a887c3f98f7514e80429067de51d11f1272a6c7ef6bbe82b5e6e6de5b394be30" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.117686 4775 scope.go:117] "RemoveContainer" containerID="13179901d0030c432b15b9cdcd3d4bcc8af1fc1231684497164b45d09b84bafe" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.145035 4775 scope.go:117] "RemoveContainer" containerID="9349131633ffa81b213e3981f82d949940d821489cd39e0d255301316c6d40ec" Mar 21 05:16:45 crc kubenswrapper[4775]: I0321 05:16:45.687500 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="47698618-487f-4849-b179-34398850f0e0" path="/var/lib/kubelet/pods/47698618-487f-4849-b179-34398850f0e0/volumes" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.302501 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.349782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") pod \"28040d61-c9ea-4a55-b113-db871dff679c\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.350093 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam\") pod \"28040d61-c9ea-4a55-b113-db871dff679c\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.350280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps8td\" (UniqueName: \"kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td\") pod \"28040d61-c9ea-4a55-b113-db871dff679c\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.356987 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td" (OuterVolumeSpecName: "kube-api-access-ps8td") pod "28040d61-c9ea-4a55-b113-db871dff679c" (UID: "28040d61-c9ea-4a55-b113-db871dff679c"). InnerVolumeSpecName "kube-api-access-ps8td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4775]: E0321 05:16:46.384405 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory podName:28040d61-c9ea-4a55-b113-db871dff679c nodeName:}" failed. No retries permitted until 2026-03-21 05:16:46.884379963 +0000 UTC m=+1759.860843587 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory") pod "28040d61-c9ea-4a55-b113-db871dff679c" (UID: "28040d61-c9ea-4a55-b113-db871dff679c") : error deleting /var/lib/kubelet/pods/28040d61-c9ea-4a55-b113-db871dff679c/volume-subpaths: remove /var/lib/kubelet/pods/28040d61-c9ea-4a55-b113-db871dff679c/volume-subpaths: no such file or directory Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.387564 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28040d61-c9ea-4a55-b113-db871dff679c" (UID: "28040d61-c9ea-4a55-b113-db871dff679c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.452838 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.452884 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps8td\" (UniqueName: \"kubernetes.io/projected/28040d61-c9ea-4a55-b113-db871dff679c-kube-api-access-ps8td\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.830289 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" event={"ID":"28040d61-c9ea-4a55-b113-db871dff679c","Type":"ContainerDied","Data":"ef0fcdf856bb63344889100ab23ff6e3c752ecddd8029d7da572bb02105b88a6"} Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.830355 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef0fcdf856bb63344889100ab23ff6e3c752ecddd8029d7da572bb02105b88a6" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.830385 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-h87nr" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.917346 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7"] Mar 21 05:16:46 crc kubenswrapper[4775]: E0321 05:16:46.917828 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28040d61-c9ea-4a55-b113-db871dff679c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.917849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="28040d61-c9ea-4a55-b113-db871dff679c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:16:46 crc kubenswrapper[4775]: E0321 05:16:46.917863 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" containerName="oc" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.917870 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" containerName="oc" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.918053 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="28040d61-c9ea-4a55-b113-db871dff679c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.918078 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" containerName="oc" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.918819 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.940064 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7"] Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.967937 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") pod \"28040d61-c9ea-4a55-b113-db871dff679c\" (UID: \"28040d61-c9ea-4a55-b113-db871dff679c\") " Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.968707 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.968902 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpgdj\" (UniqueName: \"kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.969012 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:46 crc kubenswrapper[4775]: I0321 05:16:46.976898 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory" (OuterVolumeSpecName: "inventory") pod "28040d61-c9ea-4a55-b113-db871dff679c" (UID: "28040d61-c9ea-4a55-b113-db871dff679c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.071168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpgdj\" (UniqueName: \"kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.071314 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.071457 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.071573 4775 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/28040d61-c9ea-4a55-b113-db871dff679c-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.075487 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.075487 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.090593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpgdj\" (UniqueName: \"kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.241064 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" Mar 21 05:16:47 crc kubenswrapper[4775]: I0321 05:16:47.850406 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7"] Mar 21 05:16:47 crc kubenswrapper[4775]: W0321 05:16:47.863659 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5efe4255_484c_47d7_800a_4d0dbc5cecd9.slice/crio-3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39 WatchSource:0}: Error finding container 3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39: Status 404 returned error can't find the container with id 3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39 Mar 21 05:16:48 crc kubenswrapper[4775]: I0321 05:16:48.866965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" event={"ID":"5efe4255-484c-47d7-800a-4d0dbc5cecd9","Type":"ContainerStarted","Data":"3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39"} Mar 21 05:16:49 crc kubenswrapper[4775]: I0321 05:16:49.878488 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" event={"ID":"5efe4255-484c-47d7-800a-4d0dbc5cecd9","Type":"ContainerStarted","Data":"80480f5e0585d0b76dfc565ed1e8ddada97469b8a016e2301998dae453d6703e"} Mar 21 05:16:49 crc kubenswrapper[4775]: I0321 05:16:49.908164 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" podStartSLOduration=3.119050994 podStartE2EDuration="3.908113626s" podCreationTimestamp="2026-03-21 05:16:46 +0000 UTC" firstStartedPulling="2026-03-21 05:16:47.879383674 +0000 UTC m=+1760.855847308" lastFinishedPulling="2026-03-21 05:16:48.668446316 +0000 
UTC m=+1761.644909940" observedRunningTime="2026-03-21 05:16:49.89648076 +0000 UTC m=+1762.872944404" watchObservedRunningTime="2026-03-21 05:16:49.908113626 +0000 UTC m=+1762.884577250" Mar 21 05:16:51 crc kubenswrapper[4775]: I0321 05:16:51.661649 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:16:51 crc kubenswrapper[4775]: E0321 05:16:51.662516 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:17:02 crc kubenswrapper[4775]: I0321 05:17:02.662469 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:17:02 crc kubenswrapper[4775]: E0321 05:17:02.663872 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:17:15 crc kubenswrapper[4775]: I0321 05:17:15.661884 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:17:15 crc kubenswrapper[4775]: E0321 05:17:15.663048 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:17:27 crc kubenswrapper[4775]: I0321 05:17:27.670014 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:17:27 crc kubenswrapper[4775]: E0321 05:17:27.674039 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:17:34 crc kubenswrapper[4775]: I0321 05:17:34.067264 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-td76j"] Mar 21 05:17:34 crc kubenswrapper[4775]: I0321 05:17:34.080760 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-td76j"] Mar 21 05:17:35 crc kubenswrapper[4775]: I0321 05:17:35.040933 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jxmtf"] Mar 21 05:17:35 crc kubenswrapper[4775]: I0321 05:17:35.052652 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jxmtf"] Mar 21 05:17:35 crc kubenswrapper[4775]: I0321 05:17:35.674906 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716605f1-5111-4e7a-9591-18dfb5da1984" path="/var/lib/kubelet/pods/716605f1-5111-4e7a-9591-18dfb5da1984/volumes" Mar 21 05:17:35 crc kubenswrapper[4775]: I0321 05:17:35.675756 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf8b79b-6156-4aa5-a769-3a96408745c1" 
path="/var/lib/kubelet/pods/cdf8b79b-6156-4aa5-a769-3a96408745c1/volumes" Mar 21 05:17:40 crc kubenswrapper[4775]: I0321 05:17:40.662212 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:17:40 crc kubenswrapper[4775]: E0321 05:17:40.663732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:17:45 crc kubenswrapper[4775]: I0321 05:17:45.425969 4775 scope.go:117] "RemoveContainer" containerID="36bd24e1b5566dbbfb969007cb73ea97d71cecb278d83fa7485e1f5061b3eb1d" Mar 21 05:17:45 crc kubenswrapper[4775]: I0321 05:17:45.469293 4775 scope.go:117] "RemoveContainer" containerID="e72ab47d319ea50893e1c6b3a16586fa25056fa334f81da8dd2632328d94d338" Mar 21 05:17:45 crc kubenswrapper[4775]: I0321 05:17:45.524644 4775 scope.go:117] "RemoveContainer" containerID="43def41a0cf1f318d43db6319e03e84d1a862d118a43d560e7c5e1d51f26dead" Mar 21 05:17:48 crc kubenswrapper[4775]: I0321 05:17:48.040772 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dh4pj"] Mar 21 05:17:48 crc kubenswrapper[4775]: I0321 05:17:48.060279 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wvjlb"] Mar 21 05:17:48 crc kubenswrapper[4775]: I0321 05:17:48.071576 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dh4pj"] Mar 21 05:17:48 crc kubenswrapper[4775]: I0321 05:17:48.080309 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wvjlb"] Mar 21 05:17:49 crc kubenswrapper[4775]: I0321 05:17:49.673326 4775 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cd9d81f-8c8a-46f9-9943-42dc0b638bef" path="/var/lib/kubelet/pods/5cd9d81f-8c8a-46f9-9943-42dc0b638bef/volumes"
Mar 21 05:17:49 crc kubenswrapper[4775]: I0321 05:17:49.674403 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9531f5f9-8f77-4882-b779-4210b1de81de" path="/var/lib/kubelet/pods/9531f5f9-8f77-4882-b779-4210b1de81de/volumes"
Mar 21 05:17:50 crc kubenswrapper[4775]: I0321 05:17:50.041287 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-l8qqc"]
Mar 21 05:17:50 crc kubenswrapper[4775]: I0321 05:17:50.051810 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-l8qqc"]
Mar 21 05:17:51 crc kubenswrapper[4775]: I0321 05:17:51.673875 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37bb7e34-ac47-44f6-b18f-ef4ed78eea6a" path="/var/lib/kubelet/pods/37bb7e34-ac47-44f6-b18f-ef4ed78eea6a/volumes"
Mar 21 05:17:53 crc kubenswrapper[4775]: I0321 05:17:53.662312 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"
Mar 21 05:17:53 crc kubenswrapper[4775]: E0321 05:17:53.663577 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:17:55 crc kubenswrapper[4775]: I0321 05:17:55.048007 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9cmq2"]
Mar 21 05:17:55 crc kubenswrapper[4775]: I0321 05:17:55.060181 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9cmq2"]
Mar 21 05:17:55 crc kubenswrapper[4775]: I0321 05:17:55.672114 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1658991f-7a2b-4ce9-a240-a940385e0b8f" path="/var/lib/kubelet/pods/1658991f-7a2b-4ce9-a240-a940385e0b8f/volumes"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.179318 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567838-kc6hr"]
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.183044 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.208442 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.208815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.209159 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.272882 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-kc6hr"]
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.334578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmn5q\" (UniqueName: \"kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q\") pod \"auto-csr-approver-29567838-kc6hr\" (UID: \"8ed18838-ce8f-499e-a043-886b7e878eb3\") " pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.437478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmn5q\" (UniqueName: \"kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q\") pod \"auto-csr-approver-29567838-kc6hr\" (UID: \"8ed18838-ce8f-499e-a043-886b7e878eb3\") " pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.461342 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmn5q\" (UniqueName: \"kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q\") pod \"auto-csr-approver-29567838-kc6hr\" (UID: \"8ed18838-ce8f-499e-a043-886b7e878eb3\") " pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:00 crc kubenswrapper[4775]: I0321 05:18:00.569309 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:01 crc kubenswrapper[4775]: I0321 05:18:01.100706 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-kc6hr"]
Mar 21 05:18:01 crc kubenswrapper[4775]: I0321 05:18:01.735210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-kc6hr" event={"ID":"8ed18838-ce8f-499e-a043-886b7e878eb3","Type":"ContainerStarted","Data":"91f1137d357579580cb3c8e9d35f7f77751d4d82d0d8668e688013f61047e11a"}
Mar 21 05:18:04 crc kubenswrapper[4775]: I0321 05:18:04.772552 4775 generic.go:334] "Generic (PLEG): container finished" podID="8ed18838-ce8f-499e-a043-886b7e878eb3" containerID="f7a7cb2d632bd295102216dc1df2efdc817134357d27ae93440d6f95f32d6b6b" exitCode=0
Mar 21 05:18:04 crc kubenswrapper[4775]: I0321 05:18:04.773017 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-kc6hr" event={"ID":"8ed18838-ce8f-499e-a043-886b7e878eb3","Type":"ContainerDied","Data":"f7a7cb2d632bd295102216dc1df2efdc817134357d27ae93440d6f95f32d6b6b"}
Mar 21 05:18:05 crc kubenswrapper[4775]: I0321 05:18:05.784338 4775 generic.go:334] "Generic (PLEG): container finished" podID="5efe4255-484c-47d7-800a-4d0dbc5cecd9" containerID="80480f5e0585d0b76dfc565ed1e8ddada97469b8a016e2301998dae453d6703e" exitCode=0
Mar 21 05:18:05 crc kubenswrapper[4775]: I0321 05:18:05.784850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" event={"ID":"5efe4255-484c-47d7-800a-4d0dbc5cecd9","Type":"ContainerDied","Data":"80480f5e0585d0b76dfc565ed1e8ddada97469b8a016e2301998dae453d6703e"}
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.183481 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.280230 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmn5q\" (UniqueName: \"kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q\") pod \"8ed18838-ce8f-499e-a043-886b7e878eb3\" (UID: \"8ed18838-ce8f-499e-a043-886b7e878eb3\") "
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.285904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q" (OuterVolumeSpecName: "kube-api-access-vmn5q") pod "8ed18838-ce8f-499e-a043-886b7e878eb3" (UID: "8ed18838-ce8f-499e-a043-886b7e878eb3"). InnerVolumeSpecName "kube-api-access-vmn5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.382882 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmn5q\" (UniqueName: \"kubernetes.io/projected/8ed18838-ce8f-499e-a043-886b7e878eb3-kube-api-access-vmn5q\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.798705 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-kc6hr"
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.798649 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-kc6hr" event={"ID":"8ed18838-ce8f-499e-a043-886b7e878eb3","Type":"ContainerDied","Data":"91f1137d357579580cb3c8e9d35f7f77751d4d82d0d8668e688013f61047e11a"}
Mar 21 05:18:06 crc kubenswrapper[4775]: I0321 05:18:06.798850 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91f1137d357579580cb3c8e9d35f7f77751d4d82d0d8668e688013f61047e11a"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.274903 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-cm8v6"]
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.285781 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-cm8v6"]
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.313997 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.442948 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam\") pod \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") "
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.443092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory\") pod \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") "
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.443461 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpgdj\" (UniqueName: \"kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj\") pod \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\" (UID: \"5efe4255-484c-47d7-800a-4d0dbc5cecd9\") "
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.449801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj" (OuterVolumeSpecName: "kube-api-access-qpgdj") pod "5efe4255-484c-47d7-800a-4d0dbc5cecd9" (UID: "5efe4255-484c-47d7-800a-4d0dbc5cecd9"). InnerVolumeSpecName "kube-api-access-qpgdj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.474723 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5efe4255-484c-47d7-800a-4d0dbc5cecd9" (UID: "5efe4255-484c-47d7-800a-4d0dbc5cecd9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.480293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory" (OuterVolumeSpecName: "inventory") pod "5efe4255-484c-47d7-800a-4d0dbc5cecd9" (UID: "5efe4255-484c-47d7-800a-4d0dbc5cecd9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.547062 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.547099 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5efe4255-484c-47d7-800a-4d0dbc5cecd9-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.547147 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpgdj\" (UniqueName: \"kubernetes.io/projected/5efe4255-484c-47d7-800a-4d0dbc5cecd9-kube-api-access-qpgdj\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.673901 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffbca22-6533-4ade-8f1f-eeeda82f159f" path="/var/lib/kubelet/pods/0ffbca22-6533-4ade-8f1f-eeeda82f159f/volumes"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.815349 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7" event={"ID":"5efe4255-484c-47d7-800a-4d0dbc5cecd9","Type":"ContainerDied","Data":"3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39"}
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.815400 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d0bf76bdd87a405c59bbe5ef7257f5d0e9718a3bd77269f38cb3200a09aee39"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.815431 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.924001 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"]
Mar 21 05:18:07 crc kubenswrapper[4775]: E0321 05:18:07.925174 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efe4255-484c-47d7-800a-4d0dbc5cecd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.925195 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efe4255-484c-47d7-800a-4d0dbc5cecd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:07 crc kubenswrapper[4775]: E0321 05:18:07.925217 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed18838-ce8f-499e-a043-886b7e878eb3" containerName="oc"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.925228 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed18838-ce8f-499e-a043-886b7e878eb3" containerName="oc"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.925540 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed18838-ce8f-499e-a043-886b7e878eb3" containerName="oc"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.925563 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efe4255-484c-47d7-800a-4d0dbc5cecd9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.926501 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.929008 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.929008 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.929917 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.934573 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 21 05:18:07 crc kubenswrapper[4775]: I0321 05:18:07.938005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"]
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.059333 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.059398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.059454 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskrp\" (UniqueName: \"kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.163439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.163582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.163782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskrp\" (UniqueName: \"kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.169950 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.174881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.183550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskrp\" (UniqueName: \"kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mfft7\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.263849 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:08 crc kubenswrapper[4775]: I0321 05:18:08.662414 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"
Mar 21 05:18:08 crc kubenswrapper[4775]: E0321 05:18:08.663264 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:18:09 crc kubenswrapper[4775]: I0321 05:18:09.026713 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"]
Mar 21 05:18:09 crc kubenswrapper[4775]: I0321 05:18:09.839948 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7" event={"ID":"268b27f0-a217-459e-9502-7b522ca6fe2c","Type":"ContainerStarted","Data":"220eaeb65dcf0a6aab2fe24c811cc732c89d575f1943b110ba154279a1328887"}
Mar 21 05:18:10 crc kubenswrapper[4775]: I0321 05:18:10.854178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7" event={"ID":"268b27f0-a217-459e-9502-7b522ca6fe2c","Type":"ContainerStarted","Data":"74f1c56742415a3035f0eba080f9203972e8b87038befa94adb04c3bc1be5ffb"}
Mar 21 05:18:10 crc kubenswrapper[4775]: I0321 05:18:10.881424 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7" podStartSLOduration=3.410989358 podStartE2EDuration="3.881399908s" podCreationTimestamp="2026-03-21 05:18:07 +0000 UTC" firstStartedPulling="2026-03-21 05:18:09.036517779 +0000 UTC m=+1842.012981393" lastFinishedPulling="2026-03-21 05:18:09.506928319 +0000 UTC m=+1842.483391943" observedRunningTime="2026-03-21 05:18:10.874640018 +0000 UTC m=+1843.851103642" watchObservedRunningTime="2026-03-21 05:18:10.881399908 +0000 UTC m=+1843.857863532"
Mar 21 05:18:14 crc kubenswrapper[4775]: I0321 05:18:14.919175 4775 generic.go:334] "Generic (PLEG): container finished" podID="268b27f0-a217-459e-9502-7b522ca6fe2c" containerID="74f1c56742415a3035f0eba080f9203972e8b87038befa94adb04c3bc1be5ffb" exitCode=0
Mar 21 05:18:14 crc kubenswrapper[4775]: I0321 05:18:14.919279 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7" event={"ID":"268b27f0-a217-459e-9502-7b522ca6fe2c","Type":"ContainerDied","Data":"74f1c56742415a3035f0eba080f9203972e8b87038befa94adb04c3bc1be5ffb"}
Mar 21 05:18:16 crc kubenswrapper[4775]: I0321 05:18:16.810286 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:16 crc kubenswrapper[4775]: I0321 05:18:16.996545 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fskrp\" (UniqueName: \"kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp\") pod \"268b27f0-a217-459e-9502-7b522ca6fe2c\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") "
Mar 21 05:18:16 crc kubenswrapper[4775]: I0321 05:18:16.996970 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory\") pod \"268b27f0-a217-459e-9502-7b522ca6fe2c\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") "
Mar 21 05:18:16 crc kubenswrapper[4775]: I0321 05:18:16.997140 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam\") pod \"268b27f0-a217-459e-9502-7b522ca6fe2c\" (UID: \"268b27f0-a217-459e-9502-7b522ca6fe2c\") "
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.015628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp" (OuterVolumeSpecName: "kube-api-access-fskrp") pod "268b27f0-a217-459e-9502-7b522ca6fe2c" (UID: "268b27f0-a217-459e-9502-7b522ca6fe2c"). InnerVolumeSpecName "kube-api-access-fskrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.046055 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"]
Mar 21 05:18:17 crc kubenswrapper[4775]: E0321 05:18:17.046608 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268b27f0-a217-459e-9502-7b522ca6fe2c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.046628 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="268b27f0-a217-459e-9502-7b522ca6fe2c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.046826 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="268b27f0-a217-459e-9502-7b522ca6fe2c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.047646 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.047866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "268b27f0-a217-459e-9502-7b522ca6fe2c" (UID: "268b27f0-a217-459e-9502-7b522ca6fe2c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.051377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory" (OuterVolumeSpecName: "inventory") pod "268b27f0-a217-459e-9502-7b522ca6fe2c" (UID: "268b27f0-a217-459e-9502-7b522ca6fe2c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.081418 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"]
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.100076 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fskrp\" (UniqueName: \"kubernetes.io/projected/268b27f0-a217-459e-9502-7b522ca6fe2c-kube-api-access-fskrp\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.100135 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.100149 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268b27f0-a217-459e-9502-7b522ca6fe2c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.155467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7" event={"ID":"268b27f0-a217-459e-9502-7b522ca6fe2c","Type":"ContainerDied","Data":"220eaeb65dcf0a6aab2fe24c811cc732c89d575f1943b110ba154279a1328887"}
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.155537 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="220eaeb65dcf0a6aab2fe24c811cc732c89d575f1943b110ba154279a1328887"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.155552 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mfft7"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.202794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.203096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.203336 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znk7t\" (UniqueName: \"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.306199 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znk7t\" (UniqueName: \"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.306285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.306452 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.311763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.311799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.325502 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znk7t\" (UniqueName: \"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kfrz2\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.463493 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"
Mar 21 05:18:17 crc kubenswrapper[4775]: I0321 05:18:17.996558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2"]
Mar 21 05:18:18 crc kubenswrapper[4775]: I0321 05:18:18.169381 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" event={"ID":"dbd64e65-be8d-42e7-a686-d5454932156d","Type":"ContainerStarted","Data":"051257f240552b19775c8d22bcf0e38507e6f1d72aceca09aa4be651f10da4e9"}
Mar 21 05:18:19 crc kubenswrapper[4775]: I0321 05:18:19.184402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" event={"ID":"dbd64e65-be8d-42e7-a686-d5454932156d","Type":"ContainerStarted","Data":"799cfa377f0b8d825e50cec86644f9bddd2d5dcd591a4f502902d1ae672bddca"}
Mar 21 05:18:19 crc kubenswrapper[4775]: I0321 05:18:19.211010 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" podStartSLOduration=1.675271773 podStartE2EDuration="2.210989029s" podCreationTimestamp="2026-03-21 05:18:17 +0000 UTC" firstStartedPulling="2026-03-21 05:18:18.018478044 +0000 UTC m=+1850.994941668" lastFinishedPulling="2026-03-21 05:18:18.5541953 +0000 UTC m=+1851.530658924" observedRunningTime="2026-03-21 05:18:19.209338413 +0000 UTC m=+1852.185802047" watchObservedRunningTime="2026-03-21 05:18:19.210989029 +0000 UTC m=+1852.187452653"
Mar 21 05:18:23 crc kubenswrapper[4775]: I0321 05:18:23.662895 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"
Mar 21 05:18:23 crc kubenswrapper[4775]: E0321 05:18:23.665017 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.055134 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-45fb-account-create-update-glct2"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.068026 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a38c-account-create-update-x226p"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.083963 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db3c-account-create-update-k29zp"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.094705 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-tmm94"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.103582 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zfz4w"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.112035 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-45fb-account-create-update-glct2"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.123404 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db3c-account-create-update-k29zp"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.136743 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wxmkg"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.144493 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zfz4w"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.151930 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-tmm94"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.160754 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a38c-account-create-update-x226p"]
Mar 21 05:18:30 crc kubenswrapper[4775]: I0321 05:18:30.169337 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wxmkg"]
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.675017 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028d46f7-6f14-40f6-a6cf-77aeec59a99e" path="/var/lib/kubelet/pods/028d46f7-6f14-40f6-a6cf-77aeec59a99e/volumes"
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.677806 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a5e145-5b67-4d02-9104-ade3e48888db" path="/var/lib/kubelet/pods/22a5e145-5b67-4d02-9104-ade3e48888db/volumes"
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.678482 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330e8d60-7af3-42ed-a0d7-234264037d09" path="/var/lib/kubelet/pods/330e8d60-7af3-42ed-a0d7-234264037d09/volumes"
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.679211 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e82da2-fdd4-4ead-8a9b-022e55a03690" path="/var/lib/kubelet/pods/87e82da2-fdd4-4ead-8a9b-022e55a03690/volumes"
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.680490 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65ed673-344a-4064-ae90-c9d20964c648" path="/var/lib/kubelet/pods/b65ed673-344a-4064-ae90-c9d20964c648/volumes"
Mar 21 05:18:31 crc kubenswrapper[4775]: I0321 05:18:31.681014 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc238c51-34cc-4cd6-bdf2-00e3747a9d67" path="/var/lib/kubelet/pods/bc238c51-34cc-4cd6-bdf2-00e3747a9d67/volumes"
Mar 21 05:18:38 crc kubenswrapper[4775]: I0321 05:18:38.662765 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4"
Mar 21 05:18:38 crc kubenswrapper[4775]: E0321 05:18:38.664165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.657621 4775 scope.go:117] "RemoveContainer" containerID="fdab6584b0a9bddc76fc370c865954a1758e24e006007d8b2953684194b46540"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.722295 4775 scope.go:117] "RemoveContainer" containerID="eb428452fe127829b5fdd3edea1e21f849f0fb951673f81da2334a3d27807ded"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.761203 4775 scope.go:117] "RemoveContainer" containerID="1a60b462497324b6ca2e30820dbc175100c3b70b7e0a4ec6225ceb18cce50eaa"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.821971 4775 scope.go:117] "RemoveContainer" containerID="647e5b30ad956a8a9a631808f860ad303cf32d4aa053654a5cdcf1129262af27"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.852876 4775 scope.go:117] "RemoveContainer" containerID="f761be7525081d117b389918983117196fdba3258fd2ecd1f48a78a479770d1c"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.914564 4775 scope.go:117] "RemoveContainer" containerID="796b087daa0094c51c318e24d71b74b2beb873149bcfc37bff7fc5e4bd3c8ed7"
Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.968560 4775 scope.go:117] "RemoveContainer"
containerID="959f27ca1f8883c27a7d20345a9d58bad88607efdd52bc8596a92a7571fa819e" Mar 21 05:18:45 crc kubenswrapper[4775]: I0321 05:18:45.995104 4775 scope.go:117] "RemoveContainer" containerID="84b76b926432d603dd6d10be15e6cfc9e20ac65787b9fb2a30c3bdbd38a96e09" Mar 21 05:18:46 crc kubenswrapper[4775]: I0321 05:18:46.048385 4775 scope.go:117] "RemoveContainer" containerID="f19a3eb592d811463bd26fefd306c16b1ec6187913716e5585ce5b02489e2c0c" Mar 21 05:18:46 crc kubenswrapper[4775]: I0321 05:18:46.102796 4775 scope.go:117] "RemoveContainer" containerID="0ff1159b21db4525f027b0be154a99974ea3e42eac6bac07387f589a08d088ce" Mar 21 05:18:46 crc kubenswrapper[4775]: I0321 05:18:46.131304 4775 scope.go:117] "RemoveContainer" containerID="1fd0a588f9d34ae16dba50e92cfc4f254b48cad183d5ee54a522ba1c5c91886c" Mar 21 05:18:51 crc kubenswrapper[4775]: I0321 05:18:51.663180 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:18:51 crc kubenswrapper[4775]: E0321 05:18:51.664532 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:18:54 crc kubenswrapper[4775]: I0321 05:18:54.592343 4775 generic.go:334] "Generic (PLEG): container finished" podID="dbd64e65-be8d-42e7-a686-d5454932156d" containerID="799cfa377f0b8d825e50cec86644f9bddd2d5dcd591a4f502902d1ae672bddca" exitCode=0 Mar 21 05:18:54 crc kubenswrapper[4775]: I0321 05:18:54.592428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" 
event={"ID":"dbd64e65-be8d-42e7-a686-d5454932156d","Type":"ContainerDied","Data":"799cfa377f0b8d825e50cec86644f9bddd2d5dcd591a4f502902d1ae672bddca"} Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.059060 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gk78h"] Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.070329 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gk78h"] Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.088701 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.143414 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory\") pod \"dbd64e65-be8d-42e7-a686-d5454932156d\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.143558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam\") pod \"dbd64e65-be8d-42e7-a686-d5454932156d\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.143773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znk7t\" (UniqueName: \"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t\") pod \"dbd64e65-be8d-42e7-a686-d5454932156d\" (UID: \"dbd64e65-be8d-42e7-a686-d5454932156d\") " Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.152575 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t" (OuterVolumeSpecName: "kube-api-access-znk7t") pod "dbd64e65-be8d-42e7-a686-d5454932156d" (UID: "dbd64e65-be8d-42e7-a686-d5454932156d"). InnerVolumeSpecName "kube-api-access-znk7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.180053 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dbd64e65-be8d-42e7-a686-d5454932156d" (UID: "dbd64e65-be8d-42e7-a686-d5454932156d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.182811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory" (OuterVolumeSpecName: "inventory") pod "dbd64e65-be8d-42e7-a686-d5454932156d" (UID: "dbd64e65-be8d-42e7-a686-d5454932156d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.246353 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.246426 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znk7t\" (UniqueName: \"kubernetes.io/projected/dbd64e65-be8d-42e7-a686-d5454932156d-kube-api-access-znk7t\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.246442 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dbd64e65-be8d-42e7-a686-d5454932156d-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.616166 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" event={"ID":"dbd64e65-be8d-42e7-a686-d5454932156d","Type":"ContainerDied","Data":"051257f240552b19775c8d22bcf0e38507e6f1d72aceca09aa4be651f10da4e9"} Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.616224 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051257f240552b19775c8d22bcf0e38507e6f1d72aceca09aa4be651f10da4e9" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.616250 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kfrz2" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.722787 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn"] Mar 21 05:18:56 crc kubenswrapper[4775]: E0321 05:18:56.723378 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd64e65-be8d-42e7-a686-d5454932156d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.723401 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd64e65-be8d-42e7-a686-d5454932156d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.723611 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd64e65-be8d-42e7-a686-d5454932156d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.724489 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.731736 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.732139 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.733403 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.735754 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.743763 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn"] Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.756360 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9km\" (UniqueName: \"kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.757052 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.757136 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.860186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.860266 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.860339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9km\" (UniqueName: \"kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.867686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.867693 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:56 crc kubenswrapper[4775]: I0321 05:18:56.893941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9km\" (UniqueName: \"kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:57 crc kubenswrapper[4775]: I0321 05:18:57.045167 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:18:57 crc kubenswrapper[4775]: I0321 05:18:57.591026 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn"] Mar 21 05:18:57 crc kubenswrapper[4775]: I0321 05:18:57.628763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" event={"ID":"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd","Type":"ContainerStarted","Data":"8ae2ec9e0825e6dd97cd32fd99b532a4daf6bd5ca6135d8ff0e1f5859b77b50f"} Mar 21 05:18:57 crc kubenswrapper[4775]: I0321 05:18:57.673610 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824625b1-30cc-42c5-ad83-5854770c2f46" path="/var/lib/kubelet/pods/824625b1-30cc-42c5-ad83-5854770c2f46/volumes" Mar 21 05:18:58 crc kubenswrapper[4775]: I0321 05:18:58.641512 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" event={"ID":"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd","Type":"ContainerStarted","Data":"0db68889aac768d03b3c79bf8ab7779d509c671b3324f27b94397e0842b27b48"} Mar 21 05:18:58 crc kubenswrapper[4775]: I0321 05:18:58.671674 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" podStartSLOduration=2.072325403 podStartE2EDuration="2.671638637s" podCreationTimestamp="2026-03-21 05:18:56 +0000 UTC" firstStartedPulling="2026-03-21 05:18:57.605439451 +0000 UTC m=+1890.581903075" lastFinishedPulling="2026-03-21 05:18:58.204752695 +0000 UTC m=+1891.181216309" observedRunningTime="2026-03-21 05:18:58.659873066 +0000 UTC m=+1891.636336720" watchObservedRunningTime="2026-03-21 05:18:58.671638637 +0000 UTC m=+1891.648102261" Mar 21 05:19:06 crc kubenswrapper[4775]: I0321 05:19:06.661507 4775 scope.go:117] "RemoveContainer" 
containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:19:07 crc kubenswrapper[4775]: I0321 05:19:07.742202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb"} Mar 21 05:19:18 crc kubenswrapper[4775]: I0321 05:19:18.036699 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pckp8"] Mar 21 05:19:18 crc kubenswrapper[4775]: I0321 05:19:18.071171 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pckp8"] Mar 21 05:19:19 crc kubenswrapper[4775]: I0321 05:19:19.673718 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a407dd6-593f-4806-884b-8e031639d25d" path="/var/lib/kubelet/pods/5a407dd6-593f-4806-884b-8e031639d25d/volumes" Mar 21 05:19:21 crc kubenswrapper[4775]: I0321 05:19:21.050209 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vd764"] Mar 21 05:19:21 crc kubenswrapper[4775]: I0321 05:19:21.059773 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vd764"] Mar 21 05:19:21 crc kubenswrapper[4775]: I0321 05:19:21.675928 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb1dd6e9-8801-464e-952c-d345498b132a" path="/var/lib/kubelet/pods/fb1dd6e9-8801-464e-952c-d345498b132a/volumes" Mar 21 05:19:46 crc kubenswrapper[4775]: I0321 05:19:46.394575 4775 scope.go:117] "RemoveContainer" containerID="650b9597595a69528b159834c598c6d1c2eb6858daf6ed558c1a6a03d2d450dd" Mar 21 05:19:46 crc kubenswrapper[4775]: I0321 05:19:46.442836 4775 scope.go:117] "RemoveContainer" containerID="166965bd5533ce39880e58f576322a3fbe5ebbddcee7067e02f0a62bbadf6013" Mar 21 05:19:46 crc kubenswrapper[4775]: I0321 
05:19:46.569495 4775 scope.go:117] "RemoveContainer" containerID="20d707fa7f70310d9cb80b150c04d331c2047a421ec95e9ba298af9cad00d19d" Mar 21 05:19:47 crc kubenswrapper[4775]: I0321 05:19:47.227868 4775 generic.go:334] "Generic (PLEG): container finished" podID="8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" containerID="0db68889aac768d03b3c79bf8ab7779d509c671b3324f27b94397e0842b27b48" exitCode=0 Mar 21 05:19:47 crc kubenswrapper[4775]: I0321 05:19:47.227985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" event={"ID":"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd","Type":"ContainerDied","Data":"0db68889aac768d03b3c79bf8ab7779d509c671b3324f27b94397e0842b27b48"} Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.740995 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.830300 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam\") pod \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.830447 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx9km\" (UniqueName: \"kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km\") pod \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\" (UID: \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.830498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory\") pod \"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\" (UID: 
\"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd\") " Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.838725 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km" (OuterVolumeSpecName: "kube-api-access-dx9km") pod "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" (UID: "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd"). InnerVolumeSpecName "kube-api-access-dx9km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.863967 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" (UID: "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.864394 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory" (OuterVolumeSpecName: "inventory") pod "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" (UID: "8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.933865 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.933908 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx9km\" (UniqueName: \"kubernetes.io/projected/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-kube-api-access-dx9km\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:48 crc kubenswrapper[4775]: I0321 05:19:48.933923 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.251872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" event={"ID":"8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd","Type":"ContainerDied","Data":"8ae2ec9e0825e6dd97cd32fd99b532a4daf6bd5ca6135d8ff0e1f5859b77b50f"} Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.251933 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ae2ec9e0825e6dd97cd32fd99b532a4daf6bd5ca6135d8ff0e1f5859b77b50f" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.251977 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.355105 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swg9k"] Mar 21 05:19:49 crc kubenswrapper[4775]: E0321 05:19:49.355660 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.355682 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.355976 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.356917 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.361691 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.361723 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.361701 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.361910 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.368470 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swg9k"] Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.447065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxn7\" (UniqueName: \"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.447317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.447381 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.549707 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxn7\" (UniqueName: \"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.550306 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.550358 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.557983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 
21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.557981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.571602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxn7\" (UniqueName: \"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7\") pod \"ssh-known-hosts-edpm-deployment-swg9k\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:49 crc kubenswrapper[4775]: I0321 05:19:49.692434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:19:50 crc kubenswrapper[4775]: I0321 05:19:50.228722 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swg9k"] Mar 21 05:19:50 crc kubenswrapper[4775]: I0321 05:19:50.244286 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:19:50 crc kubenswrapper[4775]: I0321 05:19:50.263430 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" event={"ID":"5f45d376-4f59-4584-a545-16d4ff066232","Type":"ContainerStarted","Data":"c8515e9316e1445df316546fb379a9a67b9033ea56ac37c614c85f82c6e2e36b"} Mar 21 05:19:51 crc kubenswrapper[4775]: I0321 05:19:51.288202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" event={"ID":"5f45d376-4f59-4584-a545-16d4ff066232","Type":"ContainerStarted","Data":"cd682044f1ea6f41eb1b64cb63d1eba5cf3b6662c7cf831d0a19d6298440a380"} Mar 21 05:19:58 crc 
kubenswrapper[4775]: I0321 05:19:58.380254 4775 generic.go:334] "Generic (PLEG): container finished" podID="5f45d376-4f59-4584-a545-16d4ff066232" containerID="cd682044f1ea6f41eb1b64cb63d1eba5cf3b6662c7cf831d0a19d6298440a380" exitCode=0 Mar 21 05:19:58 crc kubenswrapper[4775]: I0321 05:19:58.380334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" event={"ID":"5f45d376-4f59-4584-a545-16d4ff066232","Type":"ContainerDied","Data":"cd682044f1ea6f41eb1b64cb63d1eba5cf3b6662c7cf831d0a19d6298440a380"} Mar 21 05:19:59 crc kubenswrapper[4775]: I0321 05:19:59.885163 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.000110 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0\") pod \"5f45d376-4f59-4584-a545-16d4ff066232\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.000363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzxn7\" (UniqueName: \"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7\") pod \"5f45d376-4f59-4584-a545-16d4ff066232\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.000536 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam\") pod \"5f45d376-4f59-4584-a545-16d4ff066232\" (UID: \"5f45d376-4f59-4584-a545-16d4ff066232\") " Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.008250 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7" (OuterVolumeSpecName: "kube-api-access-mzxn7") pod "5f45d376-4f59-4584-a545-16d4ff066232" (UID: "5f45d376-4f59-4584-a545-16d4ff066232"). InnerVolumeSpecName "kube-api-access-mzxn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.032412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5f45d376-4f59-4584-a545-16d4ff066232" (UID: "5f45d376-4f59-4584-a545-16d4ff066232"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.036632 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f45d376-4f59-4584-a545-16d4ff066232" (UID: "5f45d376-4f59-4584-a545-16d4ff066232"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.103389 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.103529 4775 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5f45d376-4f59-4584-a545-16d4ff066232-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.103544 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzxn7\" (UniqueName: \"kubernetes.io/projected/5f45d376-4f59-4584-a545-16d4ff066232-kube-api-access-mzxn7\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.165386 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567840-lm4bc"] Mar 21 05:20:00 crc kubenswrapper[4775]: E0321 05:20:00.166101 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f45d376-4f59-4584-a545-16d4ff066232" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.166141 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f45d376-4f59-4584-a545-16d4ff066232" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.166439 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f45d376-4f59-4584-a545-16d4ff066232" containerName="ssh-known-hosts-edpm-deployment" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.167891 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.175048 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.175075 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.175183 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.181624 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-lm4bc"] Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.307979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2gp\" (UniqueName: \"kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp\") pod \"auto-csr-approver-29567840-lm4bc\" (UID: \"90536c4e-c461-4d2c-95dd-f08660cb2e69\") " pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.403931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" event={"ID":"5f45d376-4f59-4584-a545-16d4ff066232","Type":"ContainerDied","Data":"c8515e9316e1445df316546fb379a9a67b9033ea56ac37c614c85f82c6e2e36b"} Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.403985 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8515e9316e1445df316546fb379a9a67b9033ea56ac37c614c85f82c6e2e36b" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.404055 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swg9k" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.409921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2gp\" (UniqueName: \"kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp\") pod \"auto-csr-approver-29567840-lm4bc\" (UID: \"90536c4e-c461-4d2c-95dd-f08660cb2e69\") " pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.446580 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2gp\" (UniqueName: \"kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp\") pod \"auto-csr-approver-29567840-lm4bc\" (UID: \"90536c4e-c461-4d2c-95dd-f08660cb2e69\") " pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.485801 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52"] Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.487100 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.490456 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.490728 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.490805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.494959 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.497058 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.525237 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52"] Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.615171 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.615264 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmhw\" (UniqueName: \"kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.615491 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.720056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.720136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmhw\" (UniqueName: \"kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.720551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.726064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.727943 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.740853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmhw\" (UniqueName: \"kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-wtk52\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:00 crc kubenswrapper[4775]: I0321 05:20:00.828020 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:01 crc kubenswrapper[4775]: I0321 05:20:01.056637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-lm4bc"] Mar 21 05:20:01 crc kubenswrapper[4775]: I0321 05:20:01.432441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" event={"ID":"90536c4e-c461-4d2c-95dd-f08660cb2e69","Type":"ContainerStarted","Data":"d8db51af6d44c3f83810d412d859fe0105e96ac93de4bd29c22806706169bf8e"} Mar 21 05:20:01 crc kubenswrapper[4775]: I0321 05:20:01.484053 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52"] Mar 21 05:20:01 crc kubenswrapper[4775]: W0321 05:20:01.486271 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cf3d7cc_425b_4d40_a26c_a88d2b210c0a.slice/crio-6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e WatchSource:0}: Error finding container 6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e: Status 404 returned error can't find the container with id 6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e Mar 21 05:20:02 crc kubenswrapper[4775]: I0321 05:20:02.450278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" event={"ID":"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a","Type":"ContainerStarted","Data":"7d776ce17e88d2b02f97757ee994a3ce5c88d09fa2bf2dab0d0c6611a107f314"} Mar 21 05:20:02 crc kubenswrapper[4775]: I0321 05:20:02.450811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" event={"ID":"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a","Type":"ContainerStarted","Data":"6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e"} Mar 21 05:20:02 
crc kubenswrapper[4775]: I0321 05:20:02.482363 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" podStartSLOduration=1.931591224 podStartE2EDuration="2.482342183s" podCreationTimestamp="2026-03-21 05:20:00 +0000 UTC" firstStartedPulling="2026-03-21 05:20:01.48951521 +0000 UTC m=+1954.465978834" lastFinishedPulling="2026-03-21 05:20:02.040266169 +0000 UTC m=+1955.016729793" observedRunningTime="2026-03-21 05:20:02.47688896 +0000 UTC m=+1955.453352604" watchObservedRunningTime="2026-03-21 05:20:02.482342183 +0000 UTC m=+1955.458805807" Mar 21 05:20:03 crc kubenswrapper[4775]: I0321 05:20:03.470189 4775 generic.go:334] "Generic (PLEG): container finished" podID="90536c4e-c461-4d2c-95dd-f08660cb2e69" containerID="383196299bf8ab4c199e90d21197235aa0a2587594bef58700d15b41f3c4abec" exitCode=0 Mar 21 05:20:03 crc kubenswrapper[4775]: I0321 05:20:03.470313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" event={"ID":"90536c4e-c461-4d2c-95dd-f08660cb2e69","Type":"ContainerDied","Data":"383196299bf8ab4c199e90d21197235aa0a2587594bef58700d15b41f3c4abec"} Mar 21 05:20:04 crc kubenswrapper[4775]: I0321 05:20:04.054457 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-s5ct6"] Mar 21 05:20:04 crc kubenswrapper[4775]: I0321 05:20:04.063242 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-s5ct6"] Mar 21 05:20:04 crc kubenswrapper[4775]: I0321 05:20:04.884251 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.106832 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q2gp\" (UniqueName: \"kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp\") pod \"90536c4e-c461-4d2c-95dd-f08660cb2e69\" (UID: \"90536c4e-c461-4d2c-95dd-f08660cb2e69\") " Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.115715 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp" (OuterVolumeSpecName: "kube-api-access-8q2gp") pod "90536c4e-c461-4d2c-95dd-f08660cb2e69" (UID: "90536c4e-c461-4d2c-95dd-f08660cb2e69"). InnerVolumeSpecName "kube-api-access-8q2gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.210010 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q2gp\" (UniqueName: \"kubernetes.io/projected/90536c4e-c461-4d2c-95dd-f08660cb2e69-kube-api-access-8q2gp\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.499618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" event={"ID":"90536c4e-c461-4d2c-95dd-f08660cb2e69","Type":"ContainerDied","Data":"d8db51af6d44c3f83810d412d859fe0105e96ac93de4bd29c22806706169bf8e"} Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.499687 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8db51af6d44c3f83810d412d859fe0105e96ac93de4bd29c22806706169bf8e" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.499769 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-lm4bc" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.675589 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3172d76-da59-4dee-95df-6b74b7fb0033" path="/var/lib/kubelet/pods/f3172d76-da59-4dee-95df-6b74b7fb0033/volumes" Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.961488 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-bqpk6"] Mar 21 05:20:05 crc kubenswrapper[4775]: I0321 05:20:05.971214 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-bqpk6"] Mar 21 05:20:07 crc kubenswrapper[4775]: I0321 05:20:07.678819 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca04e19-ebb7-47c3-a588-e3dc932857bf" path="/var/lib/kubelet/pods/dca04e19-ebb7-47c3-a588-e3dc932857bf/volumes" Mar 21 05:20:11 crc kubenswrapper[4775]: I0321 05:20:11.615699 4775 generic.go:334] "Generic (PLEG): container finished" podID="8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" containerID="7d776ce17e88d2b02f97757ee994a3ce5c88d09fa2bf2dab0d0c6611a107f314" exitCode=0 Mar 21 05:20:11 crc kubenswrapper[4775]: I0321 05:20:11.615803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" event={"ID":"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a","Type":"ContainerDied","Data":"7d776ce17e88d2b02f97757ee994a3ce5c88d09fa2bf2dab0d0c6611a107f314"} Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.145504 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.261136 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory\") pod \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.261254 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmhw\" (UniqueName: \"kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw\") pod \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.261694 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam\") pod \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\" (UID: \"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a\") " Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.269453 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw" (OuterVolumeSpecName: "kube-api-access-xxmhw") pod "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" (UID: "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a"). InnerVolumeSpecName "kube-api-access-xxmhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.297676 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" (UID: "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.298199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory" (OuterVolumeSpecName: "inventory") pod "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" (UID: "8cf3d7cc-425b-4d40-a26c-a88d2b210c0a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.364052 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.364109 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.364146 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmhw\" (UniqueName: \"kubernetes.io/projected/8cf3d7cc-425b-4d40-a26c-a88d2b210c0a-kube-api-access-xxmhw\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.663567 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.733516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-wtk52" event={"ID":"8cf3d7cc-425b-4d40-a26c-a88d2b210c0a","Type":"ContainerDied","Data":"6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e"} Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.734260 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e010efa4efc7a62ef4d3be3e4b05434c3faa994dfeeba4e2ae97f91893d553e" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.749508 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6"] Mar 21 05:20:13 crc kubenswrapper[4775]: E0321 05:20:13.751185 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90536c4e-c461-4d2c-95dd-f08660cb2e69" containerName="oc" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.751205 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="90536c4e-c461-4d2c-95dd-f08660cb2e69" containerName="oc" Mar 21 05:20:13 crc kubenswrapper[4775]: E0321 05:20:13.751229 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.751237 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.751459 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="90536c4e-c461-4d2c-95dd-f08660cb2e69" containerName="oc" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.751470 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8cf3d7cc-425b-4d40-a26c-a88d2b210c0a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.752177 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.762233 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.762530 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.762678 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.762780 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.779045 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6"] Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.876212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.876476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm6hn\" (UniqueName: \"kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.876609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.978928 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm6hn\" (UniqueName: \"kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.979025 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.979108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc 
kubenswrapper[4775]: I0321 05:20:13.983945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:13 crc kubenswrapper[4775]: I0321 05:20:13.985293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:14 crc kubenswrapper[4775]: I0321 05:20:14.005461 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm6hn\" (UniqueName: \"kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:14 crc kubenswrapper[4775]: I0321 05:20:14.084990 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:14 crc kubenswrapper[4775]: I0321 05:20:14.660292 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6"] Mar 21 05:20:14 crc kubenswrapper[4775]: I0321 05:20:14.699625 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" event={"ID":"034f630d-d6d6-41f0-8df6-e5db37b778f3","Type":"ContainerStarted","Data":"5f2a16a34435a3935460d73dfabdb86ed489cb9cf19fa06ddc502da14005e057"} Mar 21 05:20:17 crc kubenswrapper[4775]: I0321 05:20:17.758972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" event={"ID":"034f630d-d6d6-41f0-8df6-e5db37b778f3","Type":"ContainerStarted","Data":"fe3a52e25338bc80c572fe45a80d5fa4e0c87be67de390c6641af69c0274c9cc"} Mar 21 05:20:17 crc kubenswrapper[4775]: I0321 05:20:17.782699 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" podStartSLOduration=2.717090851 podStartE2EDuration="4.782679084s" podCreationTimestamp="2026-03-21 05:20:13 +0000 UTC" firstStartedPulling="2026-03-21 05:20:14.675640231 +0000 UTC m=+1967.652103855" lastFinishedPulling="2026-03-21 05:20:16.741228464 +0000 UTC m=+1969.717692088" observedRunningTime="2026-03-21 05:20:17.775319847 +0000 UTC m=+1970.751783471" watchObservedRunningTime="2026-03-21 05:20:17.782679084 +0000 UTC m=+1970.759142708" Mar 21 05:20:27 crc kubenswrapper[4775]: I0321 05:20:27.854636 4775 generic.go:334] "Generic (PLEG): container finished" podID="034f630d-d6d6-41f0-8df6-e5db37b778f3" containerID="fe3a52e25338bc80c572fe45a80d5fa4e0c87be67de390c6641af69c0274c9cc" exitCode=0 Mar 21 05:20:27 crc kubenswrapper[4775]: I0321 05:20:27.854712 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" event={"ID":"034f630d-d6d6-41f0-8df6-e5db37b778f3","Type":"ContainerDied","Data":"fe3a52e25338bc80c572fe45a80d5fa4e0c87be67de390c6641af69c0274c9cc"} Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.343486 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.456696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory\") pod \"034f630d-d6d6-41f0-8df6-e5db37b778f3\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.456861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam\") pod \"034f630d-d6d6-41f0-8df6-e5db37b778f3\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.457062 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm6hn\" (UniqueName: \"kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn\") pod \"034f630d-d6d6-41f0-8df6-e5db37b778f3\" (UID: \"034f630d-d6d6-41f0-8df6-e5db37b778f3\") " Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.464405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn" (OuterVolumeSpecName: "kube-api-access-fm6hn") pod "034f630d-d6d6-41f0-8df6-e5db37b778f3" (UID: "034f630d-d6d6-41f0-8df6-e5db37b778f3"). InnerVolumeSpecName "kube-api-access-fm6hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.491986 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory" (OuterVolumeSpecName: "inventory") pod "034f630d-d6d6-41f0-8df6-e5db37b778f3" (UID: "034f630d-d6d6-41f0-8df6-e5db37b778f3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.493965 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "034f630d-d6d6-41f0-8df6-e5db37b778f3" (UID: "034f630d-d6d6-41f0-8df6-e5db37b778f3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.560351 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm6hn\" (UniqueName: \"kubernetes.io/projected/034f630d-d6d6-41f0-8df6-e5db37b778f3-kube-api-access-fm6hn\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.560413 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.560425 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/034f630d-d6d6-41f0-8df6-e5db37b778f3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.892568 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" 
event={"ID":"034f630d-d6d6-41f0-8df6-e5db37b778f3","Type":"ContainerDied","Data":"5f2a16a34435a3935460d73dfabdb86ed489cb9cf19fa06ddc502da14005e057"} Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.892624 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2a16a34435a3935460d73dfabdb86ed489cb9cf19fa06ddc502da14005e057" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.892653 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.982839 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f"] Mar 21 05:20:29 crc kubenswrapper[4775]: E0321 05:20:29.983413 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="034f630d-d6d6-41f0-8df6-e5db37b778f3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.983443 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="034f630d-d6d6-41f0-8df6-e5db37b778f3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.983684 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="034f630d-d6d6-41f0-8df6-e5db37b778f3" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.984731 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.990516 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.990826 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.994166 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.994442 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.995666 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.995763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.995985 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:20:29 crc kubenswrapper[4775]: I0321 05:20:29.998269 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.002353 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f"] Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078415 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078527 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078612 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.078831 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079658 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46m9\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079714 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.079875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182454 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182516 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182557 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182694 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182735 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46m9\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182836 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182874 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.182950 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: 
\"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.183012 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.183038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.183078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.187556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.188586 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.188772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.189605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.189637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.189675 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.191075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.191943 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.192676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.192774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.193479 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.200112 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.200960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.206036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46m9\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.303214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.855901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f"] Mar 21 05:20:30 crc kubenswrapper[4775]: W0321 05:20:30.870086 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8390751b_3911_4a24_a1a2_c3d1d10da875.slice/crio-210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c WatchSource:0}: Error finding container 210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c: Status 404 returned error can't find the container with id 210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c Mar 21 05:20:30 crc kubenswrapper[4775]: I0321 05:20:30.903935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" event={"ID":"8390751b-3911-4a24-a1a2-c3d1d10da875","Type":"ContainerStarted","Data":"210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c"} Mar 21 05:20:32 crc kubenswrapper[4775]: I0321 05:20:32.923759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" event={"ID":"8390751b-3911-4a24-a1a2-c3d1d10da875","Type":"ContainerStarted","Data":"fb4a533700f8e88d08d3b8894d316fb781c0daefbc69e4bdf3fb51c50ad22ac6"} Mar 21 05:20:32 crc kubenswrapper[4775]: I0321 05:20:32.953225 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" podStartSLOduration=3.148408717 podStartE2EDuration="3.953198276s" podCreationTimestamp="2026-03-21 05:20:29 +0000 UTC" firstStartedPulling="2026-03-21 05:20:30.873140877 +0000 UTC m=+1983.849604501" lastFinishedPulling="2026-03-21 05:20:31.677930436 +0000 UTC m=+1984.654394060" observedRunningTime="2026-03-21 05:20:32.941213949 +0000 UTC m=+1985.917677603" watchObservedRunningTime="2026-03-21 05:20:32.953198276 +0000 UTC m=+1985.929661900" Mar 21 05:20:46 crc kubenswrapper[4775]: I0321 05:20:46.721825 4775 scope.go:117] "RemoveContainer" containerID="a1c552cc0e8c345fb5a90c312288f579e557c98ae97907195961586729f792ec" Mar 21 05:20:46 crc kubenswrapper[4775]: I0321 05:20:46.841922 4775 scope.go:117] "RemoveContainer" containerID="244997576ccb04a6089eebd186997e7f409b2534428a82f578796006c7c674ed" Mar 21 05:21:07 crc kubenswrapper[4775]: I0321 05:21:07.256850 4775 generic.go:334] "Generic (PLEG): container finished" podID="8390751b-3911-4a24-a1a2-c3d1d10da875" containerID="fb4a533700f8e88d08d3b8894d316fb781c0daefbc69e4bdf3fb51c50ad22ac6" exitCode=0 Mar 21 05:21:07 crc kubenswrapper[4775]: I0321 05:21:07.256968 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" event={"ID":"8390751b-3911-4a24-a1a2-c3d1d10da875","Type":"ContainerDied","Data":"fb4a533700f8e88d08d3b8894d316fb781c0daefbc69e4bdf3fb51c50ad22ac6"} Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.778807 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.845868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.845922 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.845980 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846019 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc 
kubenswrapper[4775]: I0321 05:21:08.846088 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846157 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46m9\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846301 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 
05:21:08.846328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846358 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846403 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.846431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8390751b-3911-4a24-a1a2-c3d1d10da875\" (UID: \"8390751b-3911-4a24-a1a2-c3d1d10da875\") " Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.857484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9" (OuterVolumeSpecName: "kube-api-access-z46m9") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "kube-api-access-z46m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.859822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.859909 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.860695 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.860741 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.860863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.860949 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.861004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.861022 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.861568 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.862370 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.871476 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.889297 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory" (OuterVolumeSpecName: "inventory") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.889763 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8390751b-3911-4a24-a1a2-c3d1d10da875" (UID: "8390751b-3911-4a24-a1a2-c3d1d10da875"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949191 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949244 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949258 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949268 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949280 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949299 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949313 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949324 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949336 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949345 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949355 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8390751b-3911-4a24-a1a2-c3d1d10da875-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949367 4775 reconciler_common.go:293] "Volume detached for 
volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949378 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46m9\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-kube-api-access-z46m9\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:08 crc kubenswrapper[4775]: I0321 05:21:08.949388 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8390751b-3911-4a24-a1a2-c3d1d10da875-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.279325 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" event={"ID":"8390751b-3911-4a24-a1a2-c3d1d10da875","Type":"ContainerDied","Data":"210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c"} Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.280033 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210bdffc220ee9aa5fdb6ed382504df48708beccdf4e3e885144cf89e9272f6c" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.279434 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.536647 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj"] Mar 21 05:21:09 crc kubenswrapper[4775]: E0321 05:21:09.537219 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8390751b-3911-4a24-a1a2-c3d1d10da875" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.537239 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8390751b-3911-4a24-a1a2-c3d1d10da875" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.537469 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8390751b-3911-4a24-a1a2-c3d1d10da875" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.540238 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.544019 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.549305 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.552814 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.553065 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.553199 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.555166 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj"] Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.690910 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.691015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.691065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.691352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.691488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8vp\" (UniqueName: \"kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.794263 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.794359 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.794386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.794560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.794647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8vp\" (UniqueName: \"kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.795299 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: 
\"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.799857 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.800911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.805787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.825252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8vp\" (UniqueName: \"kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-45kkj\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:09 crc kubenswrapper[4775]: I0321 05:21:09.863296 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:21:10 crc kubenswrapper[4775]: I0321 05:21:10.442020 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj"] Mar 21 05:21:11 crc kubenswrapper[4775]: I0321 05:21:11.305369 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" event={"ID":"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d","Type":"ContainerStarted","Data":"91615bfc7851440e5b50d2e960a1c3c69c8e29ff7364666e05cee85e6021db4b"} Mar 21 05:21:13 crc kubenswrapper[4775]: I0321 05:21:13.330426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" event={"ID":"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d","Type":"ContainerStarted","Data":"44f7e0cba4f9c13c945de5cfc1e82d38f03c6e9791b0edfa1200a4d76e980571"} Mar 21 05:21:13 crc kubenswrapper[4775]: I0321 05:21:13.357153 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" podStartSLOduration=2.5938170080000003 podStartE2EDuration="4.357093854s" podCreationTimestamp="2026-03-21 05:21:09 +0000 UTC" firstStartedPulling="2026-03-21 05:21:10.448626553 +0000 UTC m=+2023.425090177" lastFinishedPulling="2026-03-21 05:21:12.211903399 +0000 UTC m=+2025.188367023" observedRunningTime="2026-03-21 05:21:13.352169056 +0000 UTC m=+2026.328632690" watchObservedRunningTime="2026-03-21 05:21:13.357093854 +0000 UTC m=+2026.333557478" Mar 21 05:21:32 crc kubenswrapper[4775]: I0321 05:21:32.482266 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:21:32 crc kubenswrapper[4775]: I0321 05:21:32.482865 
4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.151497 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567842-zs7vw"] Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.153889 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.158541 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.158789 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.161497 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.164682 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-zs7vw"] Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.243941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxh4\" (UniqueName: \"kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4\") pod \"auto-csr-approver-29567842-zs7vw\" (UID: \"616ac3d7-a66e-45e3-b9ca-257dbd29e212\") " pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.346031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7mxh4\" (UniqueName: \"kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4\") pod \"auto-csr-approver-29567842-zs7vw\" (UID: \"616ac3d7-a66e-45e3-b9ca-257dbd29e212\") " pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.368397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxh4\" (UniqueName: \"kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4\") pod \"auto-csr-approver-29567842-zs7vw\" (UID: \"616ac3d7-a66e-45e3-b9ca-257dbd29e212\") " pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.474262 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:00 crc kubenswrapper[4775]: I0321 05:22:00.930590 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-zs7vw"] Mar 21 05:22:01 crc kubenswrapper[4775]: I0321 05:22:01.812578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" event={"ID":"616ac3d7-a66e-45e3-b9ca-257dbd29e212","Type":"ContainerStarted","Data":"468537eb5adb84885c64524865c1e78dfcc317bb1fa504df59db45e3a6ec9df0"} Mar 21 05:22:02 crc kubenswrapper[4775]: I0321 05:22:02.482364 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:22:02 crc kubenswrapper[4775]: I0321 05:22:02.482766 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:22:02 crc kubenswrapper[4775]: I0321 05:22:02.824548 4775 generic.go:334] "Generic (PLEG): container finished" podID="616ac3d7-a66e-45e3-b9ca-257dbd29e212" containerID="a383e6d9b0bff1f3c71589e15f0f0c78443f3a1f70c9b149974bd8327eb1f252" exitCode=0 Mar 21 05:22:02 crc kubenswrapper[4775]: I0321 05:22:02.824595 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" event={"ID":"616ac3d7-a66e-45e3-b9ca-257dbd29e212","Type":"ContainerDied","Data":"a383e6d9b0bff1f3c71589e15f0f0c78443f3a1f70c9b149974bd8327eb1f252"} Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.199894 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.348797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mxh4\" (UniqueName: \"kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4\") pod \"616ac3d7-a66e-45e3-b9ca-257dbd29e212\" (UID: \"616ac3d7-a66e-45e3-b9ca-257dbd29e212\") " Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.356083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4" (OuterVolumeSpecName: "kube-api-access-7mxh4") pod "616ac3d7-a66e-45e3-b9ca-257dbd29e212" (UID: "616ac3d7-a66e-45e3-b9ca-257dbd29e212"). InnerVolumeSpecName "kube-api-access-7mxh4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.452978 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mxh4\" (UniqueName: \"kubernetes.io/projected/616ac3d7-a66e-45e3-b9ca-257dbd29e212-kube-api-access-7mxh4\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.843318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" event={"ID":"616ac3d7-a66e-45e3-b9ca-257dbd29e212","Type":"ContainerDied","Data":"468537eb5adb84885c64524865c1e78dfcc317bb1fa504df59db45e3a6ec9df0"} Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.843696 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468537eb5adb84885c64524865c1e78dfcc317bb1fa504df59db45e3a6ec9df0" Mar 21 05:22:04 crc kubenswrapper[4775]: I0321 05:22:04.843371 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-zs7vw" Mar 21 05:22:05 crc kubenswrapper[4775]: I0321 05:22:05.288919 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-whhlh"] Mar 21 05:22:05 crc kubenswrapper[4775]: I0321 05:22:05.298088 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-whhlh"] Mar 21 05:22:05 crc kubenswrapper[4775]: I0321 05:22:05.673528 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc" path="/var/lib/kubelet/pods/4b08d38e-ec7f-4b0a-9f67-11a4b2b4d3fc/volumes" Mar 21 05:22:14 crc kubenswrapper[4775]: I0321 05:22:14.935168 4775 generic.go:334] "Generic (PLEG): container finished" podID="3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" containerID="44f7e0cba4f9c13c945de5cfc1e82d38f03c6e9791b0edfa1200a4d76e980571" exitCode=0 Mar 21 05:22:14 crc kubenswrapper[4775]: I0321 05:22:14.935729 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" event={"ID":"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d","Type":"ContainerDied","Data":"44f7e0cba4f9c13c945de5cfc1e82d38f03c6e9791b0edfa1200a4d76e980571"} Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.439294 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.502849 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8vp\" (UniqueName: \"kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp\") pod \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.502941 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam\") pod \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.503288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle\") pod \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.503322 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0\") pod \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 
05:22:16.503362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory\") pod \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\" (UID: \"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d\") " Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.517396 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" (UID: "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.526395 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp" (OuterVolumeSpecName: "kube-api-access-ff8vp") pod "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" (UID: "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d"). InnerVolumeSpecName "kube-api-access-ff8vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.529707 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" (UID: "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.533876 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory" (OuterVolumeSpecName: "inventory") pod "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" (UID: "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.554222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" (UID: "3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.606300 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.606354 4775 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.606370 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.606382 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8vp\" (UniqueName: 
\"kubernetes.io/projected/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-kube-api-access-ff8vp\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.606393 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.959486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" event={"ID":"3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d","Type":"ContainerDied","Data":"91615bfc7851440e5b50d2e960a1c3c69c8e29ff7364666e05cee85e6021db4b"} Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.959540 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91615bfc7851440e5b50d2e960a1c3c69c8e29ff7364666e05cee85e6021db4b" Mar 21 05:22:16 crc kubenswrapper[4775]: I0321 05:22:16.959582 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-45kkj" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.140599 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9"] Mar 21 05:22:17 crc kubenswrapper[4775]: E0321 05:22:17.141181 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616ac3d7-a66e-45e3-b9ca-257dbd29e212" containerName="oc" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.141207 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="616ac3d7-a66e-45e3-b9ca-257dbd29e212" containerName="oc" Mar 21 05:22:17 crc kubenswrapper[4775]: E0321 05:22:17.141238 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.141247 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.141465 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.141507 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="616ac3d7-a66e-45e3-b9ca-257dbd29e212" containerName="oc" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.142339 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145256 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145370 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145399 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145452 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145565 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.145754 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.158106 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9"] Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222396 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6gcx\" (UniqueName: \"kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222508 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222531 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222567 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.222600 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.324915 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.325025 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.325798 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.325935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.326178 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.326215 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6gcx\" (UniqueName: \"kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.338382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.338601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.338637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.338857 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.339510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.351559 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6gcx\" (UniqueName: \"kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.461654 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:22:17 crc kubenswrapper[4775]: I0321 05:22:17.986577 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9"] Mar 21 05:22:18 crc kubenswrapper[4775]: I0321 05:22:18.983464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" event={"ID":"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70","Type":"ContainerStarted","Data":"a2c5ca368c5713780f99ace2b5deee2ccd41dd05d8fa3f6e617082e9fad7068e"} Mar 21 05:22:21 crc kubenswrapper[4775]: I0321 05:22:21.020187 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" event={"ID":"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70","Type":"ContainerStarted","Data":"b09ff33d04d4e01ab63f1e09911d9dd1b31d78a871761c9ff131b30c739eccb4"} Mar 21 05:22:21 crc kubenswrapper[4775]: I0321 05:22:21.052867 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" podStartSLOduration=2.212685062 podStartE2EDuration="4.052843338s" podCreationTimestamp="2026-03-21 05:22:17 +0000 UTC" firstStartedPulling="2026-03-21 05:22:17.99540236 +0000 UTC m=+2090.971865994" lastFinishedPulling="2026-03-21 05:22:19.835560636 +0000 UTC m=+2092.812024270" observedRunningTime="2026-03-21 05:22:21.037591209 +0000 UTC m=+2094.014054863" watchObservedRunningTime="2026-03-21 05:22:21.052843338 +0000 UTC m=+2094.029306962" Mar 21 05:22:32 crc kubenswrapper[4775]: I0321 
05:22:32.482512 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:22:32 crc kubenswrapper[4775]: I0321 05:22:32.483081 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:22:32 crc kubenswrapper[4775]: I0321 05:22:32.483139 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:22:32 crc kubenswrapper[4775]: I0321 05:22:32.483636 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:22:32 crc kubenswrapper[4775]: I0321 05:22:32.483689 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb" gracePeriod=600 Mar 21 05:22:33 crc kubenswrapper[4775]: I0321 05:22:33.143154 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb" exitCode=0 Mar 21 
05:22:33 crc kubenswrapper[4775]: I0321 05:22:33.143215 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb"} Mar 21 05:22:33 crc kubenswrapper[4775]: I0321 05:22:33.143470 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804"} Mar 21 05:22:33 crc kubenswrapper[4775]: I0321 05:22:33.143490 4775 scope.go:117] "RemoveContainer" containerID="5c991c4c4614039503bd13fbe36151a2388f655ed79ef2fa30df26614cdbcaa4" Mar 21 05:22:46 crc kubenswrapper[4775]: I0321 05:22:46.964568 4775 scope.go:117] "RemoveContainer" containerID="f116b6d526cd4aaa616f201bcf5084943e1e56754ce61bd36026bd8732dd3bd1" Mar 21 05:22:55 crc kubenswrapper[4775]: I0321 05:22:55.950390 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:22:55 crc kubenswrapper[4775]: I0321 05:22:55.954149 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:55 crc kubenswrapper[4775]: I0321 05:22:55.972283 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.098047 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.098288 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jvd\" (UniqueName: \"kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.098335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.200496 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jvd\" (UniqueName: \"kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.200986 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.201278 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.201627 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.202146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.240687 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jvd\" (UniqueName: \"kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd\") pod \"redhat-operators-jtxhl\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.317750 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:22:56 crc kubenswrapper[4775]: I0321 05:22:56.882826 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:22:57 crc kubenswrapper[4775]: I0321 05:22:57.376286 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerID="cd99f6278b5762bb26ce5bb130e6b5017494ab8ebb62ead08c30e6235d692bd9" exitCode=0 Mar 21 05:22:57 crc kubenswrapper[4775]: I0321 05:22:57.376364 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerDied","Data":"cd99f6278b5762bb26ce5bb130e6b5017494ab8ebb62ead08c30e6235d692bd9"} Mar 21 05:22:57 crc kubenswrapper[4775]: I0321 05:22:57.376413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerStarted","Data":"133664e00450e69e13de7d7b6de59770930fe0b11d4ea3c3c34a4f85006bcffe"} Mar 21 05:22:59 crc kubenswrapper[4775]: I0321 05:22:59.398950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerStarted","Data":"441225cb122bf66cabf5c48f54264da9293c307b82c5b70551d4830941ce387b"} Mar 21 05:23:00 crc kubenswrapper[4775]: I0321 05:23:00.414423 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerID="441225cb122bf66cabf5c48f54264da9293c307b82c5b70551d4830941ce387b" exitCode=0 Mar 21 05:23:00 crc kubenswrapper[4775]: I0321 05:23:00.414498 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" 
event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerDied","Data":"441225cb122bf66cabf5c48f54264da9293c307b82c5b70551d4830941ce387b"} Mar 21 05:23:01 crc kubenswrapper[4775]: I0321 05:23:01.428482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerStarted","Data":"a85357de462c6ddc32de73140ccff74411e8b1a6bb88a57b428ef54e3561339b"} Mar 21 05:23:01 crc kubenswrapper[4775]: I0321 05:23:01.458775 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtxhl" podStartSLOduration=2.9898515679999997 podStartE2EDuration="6.458755721s" podCreationTimestamp="2026-03-21 05:22:55 +0000 UTC" firstStartedPulling="2026-03-21 05:22:57.379596397 +0000 UTC m=+2130.356060021" lastFinishedPulling="2026-03-21 05:23:00.84850056 +0000 UTC m=+2133.824964174" observedRunningTime="2026-03-21 05:23:01.450492009 +0000 UTC m=+2134.426955653" watchObservedRunningTime="2026-03-21 05:23:01.458755721 +0000 UTC m=+2134.435219345" Mar 21 05:23:06 crc kubenswrapper[4775]: I0321 05:23:06.318394 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:06 crc kubenswrapper[4775]: I0321 05:23:06.319452 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:07 crc kubenswrapper[4775]: I0321 05:23:07.376509 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtxhl" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="registry-server" probeResult="failure" output=< Mar 21 05:23:07 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:23:07 crc kubenswrapper[4775]: > Mar 21 05:23:09 crc kubenswrapper[4775]: I0321 05:23:09.508335 4775 generic.go:334] "Generic (PLEG): 
container finished" podID="29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" containerID="b09ff33d04d4e01ab63f1e09911d9dd1b31d78a871761c9ff131b30c739eccb4" exitCode=0 Mar 21 05:23:09 crc kubenswrapper[4775]: I0321 05:23:09.508426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" event={"ID":"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70","Type":"ContainerDied","Data":"b09ff33d04d4e01ab63f1e09911d9dd1b31d78a871761c9ff131b30c739eccb4"} Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.024276 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131572 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6gcx\" (UniqueName: \"kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131638 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131681 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.131770 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle\") pod \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\" (UID: \"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70\") " Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.138577 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx" (OuterVolumeSpecName: "kube-api-access-m6gcx") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "kube-api-access-m6gcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.149330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.166372 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.168930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.172309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.173886 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory" (OuterVolumeSpecName: "inventory") pod "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" (UID: "29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234259 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234302 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234313 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6gcx\" (UniqueName: \"kubernetes.io/projected/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-kube-api-access-m6gcx\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234325 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234337 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.234345 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.530944 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" event={"ID":"29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70","Type":"ContainerDied","Data":"a2c5ca368c5713780f99ace2b5deee2ccd41dd05d8fa3f6e617082e9fad7068e"} Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.531364 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c5ca368c5713780f99ace2b5deee2ccd41dd05d8fa3f6e617082e9fad7068e" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.531442 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.653267 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz"] Mar 21 05:23:11 crc kubenswrapper[4775]: E0321 05:23:11.653911 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.653944 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.654252 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.655226 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.658677 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.658735 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.658767 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.659487 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.659814 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.675017 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz"] Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.747454 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.747598 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f844j\" (UniqueName: \"kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: 
\"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.747651 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.747689 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.747731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.849935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.850094 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f844j\" (UniqueName: \"kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.850165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.850204 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.850244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.856228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.857343 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.857609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.870870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.872374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f844j\" (UniqueName: \"kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:11 crc kubenswrapper[4775]: I0321 05:23:11.978667 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:23:12 crc kubenswrapper[4775]: I0321 05:23:12.542303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz"] Mar 21 05:23:13 crc kubenswrapper[4775]: I0321 05:23:13.552867 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" event={"ID":"a71dcc90-c70a-4ff8-bf4a-42f1a2415827","Type":"ContainerStarted","Data":"7a5a248239d97b47a9a8821636d4ceb0030545a6d3ce4d3885e064cebb074b6b"} Mar 21 05:23:14 crc kubenswrapper[4775]: I0321 05:23:14.566566 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" event={"ID":"a71dcc90-c70a-4ff8-bf4a-42f1a2415827","Type":"ContainerStarted","Data":"1e34cd56643c271492ee835c6208defe78742763b335409f4aab2ec44574dc78"} Mar 21 05:23:14 crc kubenswrapper[4775]: I0321 05:23:14.596877 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" podStartSLOduration=2.550810293 podStartE2EDuration="3.596841352s" podCreationTimestamp="2026-03-21 05:23:11 +0000 UTC" firstStartedPulling="2026-03-21 05:23:12.553615147 +0000 UTC m=+2145.530078771" lastFinishedPulling="2026-03-21 05:23:13.599646206 +0000 UTC m=+2146.576109830" observedRunningTime="2026-03-21 05:23:14.586056329 +0000 UTC m=+2147.562519993" watchObservedRunningTime="2026-03-21 05:23:14.596841352 +0000 UTC m=+2147.573305016" Mar 21 05:23:16 crc kubenswrapper[4775]: I0321 05:23:16.378277 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:16 crc kubenswrapper[4775]: I0321 05:23:16.433307 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:16 crc 
kubenswrapper[4775]: I0321 05:23:16.628055 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:23:17 crc kubenswrapper[4775]: I0321 05:23:17.600621 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtxhl" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="registry-server" containerID="cri-o://a85357de462c6ddc32de73140ccff74411e8b1a6bb88a57b428ef54e3561339b" gracePeriod=2 Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.619490 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerID="a85357de462c6ddc32de73140ccff74411e8b1a6bb88a57b428ef54e3561339b" exitCode=0 Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.619552 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerDied","Data":"a85357de462c6ddc32de73140ccff74411e8b1a6bb88a57b428ef54e3561339b"} Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.620342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtxhl" event={"ID":"d4871de6-5a5d-4c10-98d5-c73c21570947","Type":"ContainerDied","Data":"133664e00450e69e13de7d7b6de59770930fe0b11d4ea3c3c34a4f85006bcffe"} Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.620370 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133664e00450e69e13de7d7b6de59770930fe0b11d4ea3c3c34a4f85006bcffe" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.626352 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.719403 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities\") pod \"d4871de6-5a5d-4c10-98d5-c73c21570947\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.719686 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content\") pod \"d4871de6-5a5d-4c10-98d5-c73c21570947\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.719730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2jvd\" (UniqueName: \"kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd\") pod \"d4871de6-5a5d-4c10-98d5-c73c21570947\" (UID: \"d4871de6-5a5d-4c10-98d5-c73c21570947\") " Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.720968 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities" (OuterVolumeSpecName: "utilities") pod "d4871de6-5a5d-4c10-98d5-c73c21570947" (UID: "d4871de6-5a5d-4c10-98d5-c73c21570947"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.727672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd" (OuterVolumeSpecName: "kube-api-access-j2jvd") pod "d4871de6-5a5d-4c10-98d5-c73c21570947" (UID: "d4871de6-5a5d-4c10-98d5-c73c21570947"). InnerVolumeSpecName "kube-api-access-j2jvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.822566 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.822615 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2jvd\" (UniqueName: \"kubernetes.io/projected/d4871de6-5a5d-4c10-98d5-c73c21570947-kube-api-access-j2jvd\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.853788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4871de6-5a5d-4c10-98d5-c73c21570947" (UID: "d4871de6-5a5d-4c10-98d5-c73c21570947"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:18 crc kubenswrapper[4775]: I0321 05:23:18.924969 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4871de6-5a5d-4c10-98d5-c73c21570947-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:19 crc kubenswrapper[4775]: I0321 05:23:19.630785 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtxhl" Mar 21 05:23:19 crc kubenswrapper[4775]: I0321 05:23:19.677675 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:23:19 crc kubenswrapper[4775]: I0321 05:23:19.684819 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtxhl"] Mar 21 05:23:21 crc kubenswrapper[4775]: I0321 05:23:21.711779 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" path="/var/lib/kubelet/pods/d4871de6-5a5d-4c10-98d5-c73c21570947/volumes" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.037201 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:22 crc kubenswrapper[4775]: E0321 05:23:22.037836 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="extract-utilities" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.037858 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="extract-utilities" Mar 21 05:23:22 crc kubenswrapper[4775]: E0321 05:23:22.037915 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="extract-content" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.037924 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="extract-content" Mar 21 05:23:22 crc kubenswrapper[4775]: E0321 05:23:22.037939 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="registry-server" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.037947 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" 
containerName="registry-server" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.038186 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4871de6-5a5d-4c10-98d5-c73c21570947" containerName="registry-server" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.040063 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.053504 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.115617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7xp\" (UniqueName: \"kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.115690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.115737 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.217814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bg7xp\" (UniqueName: \"kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.218451 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.218526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.219043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.219315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.240043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7xp\" (UniqueName: 
\"kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp\") pod \"certified-operators-8rzc5\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:22 crc kubenswrapper[4775]: I0321 05:23:22.488810 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:23 crc kubenswrapper[4775]: I0321 05:23:23.041106 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:23 crc kubenswrapper[4775]: I0321 05:23:23.709603 4775 generic.go:334] "Generic (PLEG): container finished" podID="f864469c-129d-4283-97c3-a5bf7c335a55" containerID="3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304" exitCode=0 Mar 21 05:23:23 crc kubenswrapper[4775]: I0321 05:23:23.710219 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerDied","Data":"3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304"} Mar 21 05:23:23 crc kubenswrapper[4775]: I0321 05:23:23.710253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerStarted","Data":"eb54b40877e070b8198b71a3e3ca8585434f5e074af09bb5c6291db5937e329b"} Mar 21 05:23:25 crc kubenswrapper[4775]: I0321 05:23:25.731063 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerStarted","Data":"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41"} Mar 21 05:23:26 crc kubenswrapper[4775]: I0321 05:23:26.742996 4775 generic.go:334] "Generic (PLEG): container finished" podID="f864469c-129d-4283-97c3-a5bf7c335a55" 
containerID="81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41" exitCode=0 Mar 21 05:23:26 crc kubenswrapper[4775]: I0321 05:23:26.743099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerDied","Data":"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41"} Mar 21 05:23:28 crc kubenswrapper[4775]: I0321 05:23:28.770812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerStarted","Data":"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a"} Mar 21 05:23:28 crc kubenswrapper[4775]: I0321 05:23:28.799953 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rzc5" podStartSLOduration=2.347730028 podStartE2EDuration="6.799924275s" podCreationTimestamp="2026-03-21 05:23:22 +0000 UTC" firstStartedPulling="2026-03-21 05:23:23.712998889 +0000 UTC m=+2156.689462513" lastFinishedPulling="2026-03-21 05:23:28.165193146 +0000 UTC m=+2161.141656760" observedRunningTime="2026-03-21 05:23:28.793550756 +0000 UTC m=+2161.770014430" watchObservedRunningTime="2026-03-21 05:23:28.799924275 +0000 UTC m=+2161.776387899" Mar 21 05:23:32 crc kubenswrapper[4775]: I0321 05:23:32.489167 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:32 crc kubenswrapper[4775]: I0321 05:23:32.489832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:32 crc kubenswrapper[4775]: I0321 05:23:32.545414 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:42 crc kubenswrapper[4775]: I0321 
05:23:42.543444 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:42 crc kubenswrapper[4775]: I0321 05:23:42.598980 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:42 crc kubenswrapper[4775]: I0321 05:23:42.927475 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rzc5" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="registry-server" containerID="cri-o://7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a" gracePeriod=2 Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.398922 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.452611 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7xp\" (UniqueName: \"kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp\") pod \"f864469c-129d-4283-97c3-a5bf7c335a55\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.452787 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content\") pod \"f864469c-129d-4283-97c3-a5bf7c335a55\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.452836 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities\") pod \"f864469c-129d-4283-97c3-a5bf7c335a55\" (UID: \"f864469c-129d-4283-97c3-a5bf7c335a55\") " Mar 21 05:23:43 crc kubenswrapper[4775]: 
I0321 05:23:43.454028 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities" (OuterVolumeSpecName: "utilities") pod "f864469c-129d-4283-97c3-a5bf7c335a55" (UID: "f864469c-129d-4283-97c3-a5bf7c335a55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.464321 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp" (OuterVolumeSpecName: "kube-api-access-bg7xp") pod "f864469c-129d-4283-97c3-a5bf7c335a55" (UID: "f864469c-129d-4283-97c3-a5bf7c335a55"). InnerVolumeSpecName "kube-api-access-bg7xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.513854 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f864469c-129d-4283-97c3-a5bf7c335a55" (UID: "f864469c-129d-4283-97c3-a5bf7c335a55"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.555701 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.555744 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7xp\" (UniqueName: \"kubernetes.io/projected/f864469c-129d-4283-97c3-a5bf7c335a55-kube-api-access-bg7xp\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.555762 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f864469c-129d-4283-97c3-a5bf7c335a55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.952550 4775 generic.go:334] "Generic (PLEG): container finished" podID="f864469c-129d-4283-97c3-a5bf7c335a55" containerID="7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a" exitCode=0 Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.952658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerDied","Data":"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a"} Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.952701 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rzc5" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.953203 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rzc5" event={"ID":"f864469c-129d-4283-97c3-a5bf7c335a55","Type":"ContainerDied","Data":"eb54b40877e070b8198b71a3e3ca8585434f5e074af09bb5c6291db5937e329b"} Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.953245 4775 scope.go:117] "RemoveContainer" containerID="7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.987294 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.987504 4775 scope.go:117] "RemoveContainer" containerID="81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41" Mar 21 05:23:43 crc kubenswrapper[4775]: I0321 05:23:43.997623 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rzc5"] Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.019997 4775 scope.go:117] "RemoveContainer" containerID="3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.074099 4775 scope.go:117] "RemoveContainer" containerID="7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a" Mar 21 05:23:44 crc kubenswrapper[4775]: E0321 05:23:44.074931 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a\": container with ID starting with 7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a not found: ID does not exist" containerID="7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.074973 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a"} err="failed to get container status \"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a\": rpc error: code = NotFound desc = could not find container \"7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a\": container with ID starting with 7e9592852ea8dface72034178e5cbac515817e1921dd9f93e7ce91f809b38e5a not found: ID does not exist" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.075002 4775 scope.go:117] "RemoveContainer" containerID="81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41" Mar 21 05:23:44 crc kubenswrapper[4775]: E0321 05:23:44.075426 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41\": container with ID starting with 81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41 not found: ID does not exist" containerID="81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.075467 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41"} err="failed to get container status \"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41\": rpc error: code = NotFound desc = could not find container \"81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41\": container with ID starting with 81aedb7b0f6632b1c0988935f00df406ff8ec25de082d813684693afdec87a41 not found: ID does not exist" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.075494 4775 scope.go:117] "RemoveContainer" containerID="3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304" Mar 21 05:23:44 crc kubenswrapper[4775]: E0321 
05:23:44.076271 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304\": container with ID starting with 3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304 not found: ID does not exist" containerID="3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304" Mar 21 05:23:44 crc kubenswrapper[4775]: I0321 05:23:44.076309 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304"} err="failed to get container status \"3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304\": rpc error: code = NotFound desc = could not find container \"3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304\": container with ID starting with 3ddd3f273e635e03c0656239a822df1967d7a5fa543be70834322e9a3f10b304 not found: ID does not exist" Mar 21 05:23:45 crc kubenswrapper[4775]: I0321 05:23:45.674610 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" path="/var/lib/kubelet/pods/f864469c-129d-4283-97c3-a5bf7c335a55/volumes" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.151069 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567844-25mvs"] Mar 21 05:24:00 crc kubenswrapper[4775]: E0321 05:24:00.152380 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.152402 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4775]: E0321 05:24:00.152424 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.152432 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4775]: E0321 05:24:00.152451 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.152459 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.152761 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f864469c-129d-4283-97c3-a5bf7c335a55" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.153862 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.156589 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.156999 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.157852 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.164040 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-25mvs"] Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.255722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xrmw\" (UniqueName: 
\"kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw\") pod \"auto-csr-approver-29567844-25mvs\" (UID: \"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6\") " pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.357661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xrmw\" (UniqueName: \"kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw\") pod \"auto-csr-approver-29567844-25mvs\" (UID: \"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6\") " pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.387848 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xrmw\" (UniqueName: \"kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw\") pod \"auto-csr-approver-29567844-25mvs\" (UID: \"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6\") " pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.478612 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:00 crc kubenswrapper[4775]: I0321 05:24:00.987991 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-25mvs"] Mar 21 05:24:01 crc kubenswrapper[4775]: I0321 05:24:01.123636 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-25mvs" event={"ID":"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6","Type":"ContainerStarted","Data":"3112a924be6ecbb9a71c82c7867b14424d440c7cfce364b4203e2701aca683fb"} Mar 21 05:24:03 crc kubenswrapper[4775]: I0321 05:24:03.142419 4775 generic.go:334] "Generic (PLEG): container finished" podID="c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" containerID="5936961592e0ba9da2bfbec1bc18d0df5f38a71ab49e8c90edaa7dea4ccb02ee" exitCode=0 Mar 21 05:24:03 crc kubenswrapper[4775]: I0321 05:24:03.142523 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-25mvs" event={"ID":"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6","Type":"ContainerDied","Data":"5936961592e0ba9da2bfbec1bc18d0df5f38a71ab49e8c90edaa7dea4ccb02ee"} Mar 21 05:24:04 crc kubenswrapper[4775]: I0321 05:24:04.553469 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:04 crc kubenswrapper[4775]: I0321 05:24:04.661579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xrmw\" (UniqueName: \"kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw\") pod \"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6\" (UID: \"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6\") " Mar 21 05:24:04 crc kubenswrapper[4775]: I0321 05:24:04.668850 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw" (OuterVolumeSpecName: "kube-api-access-2xrmw") pod "c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" (UID: "c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6"). InnerVolumeSpecName "kube-api-access-2xrmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:24:04 crc kubenswrapper[4775]: I0321 05:24:04.764855 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xrmw\" (UniqueName: \"kubernetes.io/projected/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6-kube-api-access-2xrmw\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.163487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-25mvs" event={"ID":"c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6","Type":"ContainerDied","Data":"3112a924be6ecbb9a71c82c7867b14424d440c7cfce364b4203e2701aca683fb"} Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.163850 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3112a924be6ecbb9a71c82c7867b14424d440c7cfce364b4203e2701aca683fb" Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.163573 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-25mvs" Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.626623 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-kc6hr"] Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.635465 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-kc6hr"] Mar 21 05:24:05 crc kubenswrapper[4775]: I0321 05:24:05.675881 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed18838-ce8f-499e-a043-886b7e878eb3" path="/var/lib/kubelet/pods/8ed18838-ce8f-499e-a043-886b7e878eb3/volumes" Mar 21 05:24:32 crc kubenswrapper[4775]: I0321 05:24:32.482059 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:24:32 crc kubenswrapper[4775]: I0321 05:24:32.482720 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.492665 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:24:46 crc kubenswrapper[4775]: E0321 05:24:46.493620 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" containerName="oc" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.493634 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" containerName="oc" Mar 21 05:24:46 crc 
kubenswrapper[4775]: I0321 05:24:46.493809 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" containerName="oc" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.495201 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.510474 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.555708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.555852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzk6x\" (UniqueName: \"kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.555922 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.657804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.657921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.657993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzk6x\" (UniqueName: \"kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.658782 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.658831 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.683013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzk6x\" (UniqueName: 
\"kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x\") pod \"redhat-marketplace-px5pg\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:46 crc kubenswrapper[4775]: I0321 05:24:46.819365 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:47 crc kubenswrapper[4775]: I0321 05:24:47.083526 4775 scope.go:117] "RemoveContainer" containerID="f7a7cb2d632bd295102216dc1df2efdc817134357d27ae93440d6f95f32d6b6b" Mar 21 05:24:47 crc kubenswrapper[4775]: I0321 05:24:47.342497 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:24:47 crc kubenswrapper[4775]: I0321 05:24:47.636880 4775 generic.go:334] "Generic (PLEG): container finished" podID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerID="56e35510c790be074d09f5ee8f124d413aaf4e647a0fe13993e83a909111b979" exitCode=0 Mar 21 05:24:47 crc kubenswrapper[4775]: I0321 05:24:47.636930 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerDied","Data":"56e35510c790be074d09f5ee8f124d413aaf4e647a0fe13993e83a909111b979"} Mar 21 05:24:47 crc kubenswrapper[4775]: I0321 05:24:47.638603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerStarted","Data":"a5bd6b6850af17ae87be5c25d646f5af789ea57c2b5ed61ad6e28b7c8a3d545e"} Mar 21 05:24:48 crc kubenswrapper[4775]: I0321 05:24:48.651352 4775 generic.go:334] "Generic (PLEG): container finished" podID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerID="a2a8c8dc1d88f43d6ededf2e8793a512b5087dcbf7e313f245d34485e136258b" exitCode=0 Mar 21 05:24:48 crc kubenswrapper[4775]: I0321 05:24:48.651406 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerDied","Data":"a2a8c8dc1d88f43d6ededf2e8793a512b5087dcbf7e313f245d34485e136258b"} Mar 21 05:24:49 crc kubenswrapper[4775]: I0321 05:24:49.671618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerStarted","Data":"60fb5be3131696012c97db115a74ff472115d80eb7fd9153e9102904c69c91b3"} Mar 21 05:24:49 crc kubenswrapper[4775]: I0321 05:24:49.694875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-px5pg" podStartSLOduration=2.22205689 podStartE2EDuration="3.694850733s" podCreationTimestamp="2026-03-21 05:24:46 +0000 UTC" firstStartedPulling="2026-03-21 05:24:47.638734106 +0000 UTC m=+2240.615197730" lastFinishedPulling="2026-03-21 05:24:49.111527949 +0000 UTC m=+2242.087991573" observedRunningTime="2026-03-21 05:24:49.682845656 +0000 UTC m=+2242.659309290" watchObservedRunningTime="2026-03-21 05:24:49.694850733 +0000 UTC m=+2242.671314357" Mar 21 05:24:56 crc kubenswrapper[4775]: I0321 05:24:56.820216 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:56 crc kubenswrapper[4775]: I0321 05:24:56.822519 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:56 crc kubenswrapper[4775]: I0321 05:24:56.870223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:57 crc kubenswrapper[4775]: I0321 05:24:57.793264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:24:57 crc 
kubenswrapper[4775]: I0321 05:24:57.845612 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:24:59 crc kubenswrapper[4775]: I0321 05:24:59.752860 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-px5pg" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="registry-server" containerID="cri-o://60fb5be3131696012c97db115a74ff472115d80eb7fd9153e9102904c69c91b3" gracePeriod=2 Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.767923 4775 generic.go:334] "Generic (PLEG): container finished" podID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerID="60fb5be3131696012c97db115a74ff472115d80eb7fd9153e9102904c69c91b3" exitCode=0 Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.768173 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerDied","Data":"60fb5be3131696012c97db115a74ff472115d80eb7fd9153e9102904c69c91b3"} Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.768458 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-px5pg" event={"ID":"aad02c72-91c8-4ce2-b69d-915d9b0af7ec","Type":"ContainerDied","Data":"a5bd6b6850af17ae87be5c25d646f5af789ea57c2b5ed61ad6e28b7c8a3d545e"} Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.768480 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bd6b6850af17ae87be5c25d646f5af789ea57c2b5ed61ad6e28b7c8a3d545e" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.783224 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.838733 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities\") pod \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.838792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content\") pod \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.838843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzk6x\" (UniqueName: \"kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x\") pod \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\" (UID: \"aad02c72-91c8-4ce2-b69d-915d9b0af7ec\") " Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.839752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities" (OuterVolumeSpecName: "utilities") pod "aad02c72-91c8-4ce2-b69d-915d9b0af7ec" (UID: "aad02c72-91c8-4ce2-b69d-915d9b0af7ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.845828 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x" (OuterVolumeSpecName: "kube-api-access-kzk6x") pod "aad02c72-91c8-4ce2-b69d-915d9b0af7ec" (UID: "aad02c72-91c8-4ce2-b69d-915d9b0af7ec"). InnerVolumeSpecName "kube-api-access-kzk6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.869512 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aad02c72-91c8-4ce2-b69d-915d9b0af7ec" (UID: "aad02c72-91c8-4ce2-b69d-915d9b0af7ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.940914 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.940985 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:00 crc kubenswrapper[4775]: I0321 05:25:00.941008 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzk6x\" (UniqueName: \"kubernetes.io/projected/aad02c72-91c8-4ce2-b69d-915d9b0af7ec-kube-api-access-kzk6x\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:01 crc kubenswrapper[4775]: I0321 05:25:01.777228 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-px5pg" Mar 21 05:25:01 crc kubenswrapper[4775]: I0321 05:25:01.809603 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:25:01 crc kubenswrapper[4775]: I0321 05:25:01.819780 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-px5pg"] Mar 21 05:25:02 crc kubenswrapper[4775]: I0321 05:25:02.483372 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:02 crc kubenswrapper[4775]: I0321 05:25:02.483474 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:03 crc kubenswrapper[4775]: I0321 05:25:03.675104 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" path="/var/lib/kubelet/pods/aad02c72-91c8-4ce2-b69d-915d9b0af7ec/volumes" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.763051 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:25 crc kubenswrapper[4775]: E0321 05:25:25.764266 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="extract-utilities" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.764289 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="extract-utilities" Mar 21 
05:25:25 crc kubenswrapper[4775]: E0321 05:25:25.764303 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="registry-server" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.764311 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="registry-server" Mar 21 05:25:25 crc kubenswrapper[4775]: E0321 05:25:25.764343 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="extract-content" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.764353 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="extract-content" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.764596 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad02c72-91c8-4ce2-b69d-915d9b0af7ec" containerName="registry-server" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.766305 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.776391 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.903950 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqrcn\" (UniqueName: \"kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.904018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:25 crc kubenswrapper[4775]: I0321 05:25:25.904145 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.006194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.006535 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dqrcn\" (UniqueName: \"kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.006587 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.007302 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.007668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.036439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqrcn\" (UniqueName: \"kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn\") pod \"community-operators-mvbdf\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.090646 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:26 crc kubenswrapper[4775]: I0321 05:25:26.708196 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:27 crc kubenswrapper[4775]: I0321 05:25:27.050157 4775 generic.go:334] "Generic (PLEG): container finished" podID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerID="f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad" exitCode=0 Mar 21 05:25:27 crc kubenswrapper[4775]: I0321 05:25:27.050204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerDied","Data":"f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad"} Mar 21 05:25:27 crc kubenswrapper[4775]: I0321 05:25:27.050238 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerStarted","Data":"f2d2898b6bef0be3d1744eb1c72891e1071707d2629275e9a2f5d8e8d9a18417"} Mar 21 05:25:27 crc kubenswrapper[4775]: I0321 05:25:27.052720 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:25:28 crc kubenswrapper[4775]: I0321 05:25:28.063590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerStarted","Data":"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d"} Mar 21 05:25:29 crc kubenswrapper[4775]: I0321 05:25:29.077567 4775 generic.go:334] "Generic (PLEG): container finished" podID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerID="755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d" exitCode=0 Mar 21 05:25:29 crc kubenswrapper[4775]: I0321 05:25:29.077655 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerDied","Data":"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d"} Mar 21 05:25:30 crc kubenswrapper[4775]: I0321 05:25:30.093078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerStarted","Data":"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6"} Mar 21 05:25:30 crc kubenswrapper[4775]: I0321 05:25:30.120840 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mvbdf" podStartSLOduration=2.686509423 podStartE2EDuration="5.120815107s" podCreationTimestamp="2026-03-21 05:25:25 +0000 UTC" firstStartedPulling="2026-03-21 05:25:27.05236743 +0000 UTC m=+2280.028831054" lastFinishedPulling="2026-03-21 05:25:29.486673114 +0000 UTC m=+2282.463136738" observedRunningTime="2026-03-21 05:25:30.114764407 +0000 UTC m=+2283.091228051" watchObservedRunningTime="2026-03-21 05:25:30.120815107 +0000 UTC m=+2283.097278731" Mar 21 05:25:32 crc kubenswrapper[4775]: I0321 05:25:32.482313 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:32 crc kubenswrapper[4775]: I0321 05:25:32.482397 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:32 crc kubenswrapper[4775]: I0321 05:25:32.482452 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:25:32 crc kubenswrapper[4775]: I0321 05:25:32.483416 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:25:32 crc kubenswrapper[4775]: I0321 05:25:32.483488 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" gracePeriod=600 Mar 21 05:25:32 crc kubenswrapper[4775]: E0321 05:25:32.625732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:25:33 crc kubenswrapper[4775]: I0321 05:25:33.123786 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" exitCode=0 Mar 21 05:25:33 crc kubenswrapper[4775]: I0321 05:25:33.123848 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804"} Mar 21 05:25:33 crc kubenswrapper[4775]: I0321 05:25:33.123899 4775 scope.go:117] "RemoveContainer" containerID="820b5e7971de8f534813cc1fdb277aabd37bf9063cedc671ef2c92d0328150cb" Mar 21 05:25:33 crc kubenswrapper[4775]: I0321 05:25:33.124814 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:25:33 crc kubenswrapper[4775]: E0321 05:25:33.125334 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:25:36 crc kubenswrapper[4775]: I0321 05:25:36.091183 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:36 crc kubenswrapper[4775]: I0321 05:25:36.091879 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:36 crc kubenswrapper[4775]: I0321 05:25:36.144241 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:36 crc kubenswrapper[4775]: I0321 05:25:36.226694 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:36 crc kubenswrapper[4775]: I0321 05:25:36.401601 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.177801 
4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mvbdf" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="registry-server" containerID="cri-o://69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6" gracePeriod=2 Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.669059 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.777034 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content\") pod \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.777101 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities\") pod \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.777286 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqrcn\" (UniqueName: \"kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn\") pod \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\" (UID: \"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f\") " Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.778631 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities" (OuterVolumeSpecName: "utilities") pod "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" (UID: "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.779487 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.784506 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn" (OuterVolumeSpecName: "kube-api-access-dqrcn") pod "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" (UID: "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f"). InnerVolumeSpecName "kube-api-access-dqrcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:38 crc kubenswrapper[4775]: I0321 05:25:38.881339 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqrcn\" (UniqueName: \"kubernetes.io/projected/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-kube-api-access-dqrcn\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.192925 4775 generic.go:334] "Generic (PLEG): container finished" podID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerID="69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6" exitCode=0 Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.192979 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mvbdf" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.194214 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerDied","Data":"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6"} Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.194328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mvbdf" event={"ID":"bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f","Type":"ContainerDied","Data":"f2d2898b6bef0be3d1744eb1c72891e1071707d2629275e9a2f5d8e8d9a18417"} Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.194414 4775 scope.go:117] "RemoveContainer" containerID="69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.217541 4775 scope.go:117] "RemoveContainer" containerID="755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.241962 4775 scope.go:117] "RemoveContainer" containerID="f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.299297 4775 scope.go:117] "RemoveContainer" containerID="69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6" Mar 21 05:25:39 crc kubenswrapper[4775]: E0321 05:25:39.299890 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6\": container with ID starting with 69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6 not found: ID does not exist" containerID="69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.299945 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6"} err="failed to get container status \"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6\": rpc error: code = NotFound desc = could not find container \"69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6\": container with ID starting with 69f21971bb6bff290af75da43f3b23e0347527e53e0d039c662e31e7666ba6a6 not found: ID does not exist" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.299978 4775 scope.go:117] "RemoveContainer" containerID="755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d" Mar 21 05:25:39 crc kubenswrapper[4775]: E0321 05:25:39.300322 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d\": container with ID starting with 755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d not found: ID does not exist" containerID="755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.300377 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d"} err="failed to get container status \"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d\": rpc error: code = NotFound desc = could not find container \"755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d\": container with ID starting with 755d96b32d4b2250d80500cc147380b30160a4c2eb540c7f8ac9ac6c15730b9d not found: ID does not exist" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.300396 4775 scope.go:117] "RemoveContainer" containerID="f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad" Mar 21 05:25:39 crc kubenswrapper[4775]: E0321 05:25:39.300631 4775 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad\": container with ID starting with f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad not found: ID does not exist" containerID="f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.300653 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad"} err="failed to get container status \"f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad\": rpc error: code = NotFound desc = could not find container \"f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad\": container with ID starting with f726a1456ad825ba50fc40daddce363b0cc5faa4387d4487cfa9ce395caedbad not found: ID does not exist" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.437220 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" (UID: "bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.494704 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.535184 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.544581 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mvbdf"] Mar 21 05:25:39 crc kubenswrapper[4775]: I0321 05:25:39.675106 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" path="/var/lib/kubelet/pods/bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f/volumes" Mar 21 05:25:48 crc kubenswrapper[4775]: I0321 05:25:48.662468 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:25:48 crc kubenswrapper[4775]: E0321 05:25:48.663594 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.155392 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567846-6lslz"] Mar 21 05:26:00 crc kubenswrapper[4775]: E0321 05:26:00.156445 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="registry-server" Mar 21 05:26:00 crc 
kubenswrapper[4775]: I0321 05:26:00.156461 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="registry-server" Mar 21 05:26:00 crc kubenswrapper[4775]: E0321 05:26:00.156483 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="extract-utilities" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.156489 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="extract-utilities" Mar 21 05:26:00 crc kubenswrapper[4775]: E0321 05:26:00.156508 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="extract-content" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.156516 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="extract-content" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.156722 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad21dd2-8ec4-407c-a1d5-8c617dd4fc1f" containerName="registry-server" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.157415 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.163900 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.164184 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.164987 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.165830 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-6lslz"] Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.263965 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2wl\" (UniqueName: \"kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl\") pod \"auto-csr-approver-29567846-6lslz\" (UID: \"43f89ba5-2605-4bdb-bc77-020c6bf9db75\") " pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.366308 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2wl\" (UniqueName: \"kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl\") pod \"auto-csr-approver-29567846-6lslz\" (UID: \"43f89ba5-2605-4bdb-bc77-020c6bf9db75\") " pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.389431 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2wl\" (UniqueName: \"kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl\") pod \"auto-csr-approver-29567846-6lslz\" (UID: \"43f89ba5-2605-4bdb-bc77-020c6bf9db75\") " 
pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.492786 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.661429 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:26:00 crc kubenswrapper[4775]: E0321 05:26:00.662455 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:26:00 crc kubenswrapper[4775]: I0321 05:26:00.956061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-6lslz"] Mar 21 05:26:01 crc kubenswrapper[4775]: I0321 05:26:01.405339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-6lslz" event={"ID":"43f89ba5-2605-4bdb-bc77-020c6bf9db75","Type":"ContainerStarted","Data":"a0cf3c13c79d3900fe284a3ffd7e6e9a637b796bfedf0ece83845a7f24aeae65"} Mar 21 05:26:02 crc kubenswrapper[4775]: I0321 05:26:02.418045 4775 generic.go:334] "Generic (PLEG): container finished" podID="43f89ba5-2605-4bdb-bc77-020c6bf9db75" containerID="a8931f9c5d17f0e3f84a26748f81ac4ac575c43d3a88da2c5e6613b2fee09eee" exitCode=0 Mar 21 05:26:02 crc kubenswrapper[4775]: I0321 05:26:02.418107 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-6lslz" event={"ID":"43f89ba5-2605-4bdb-bc77-020c6bf9db75","Type":"ContainerDied","Data":"a8931f9c5d17f0e3f84a26748f81ac4ac575c43d3a88da2c5e6613b2fee09eee"} 
Mar 21 05:26:03 crc kubenswrapper[4775]: I0321 05:26:03.808792 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:03 crc kubenswrapper[4775]: I0321 05:26:03.942808 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr2wl\" (UniqueName: \"kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl\") pod \"43f89ba5-2605-4bdb-bc77-020c6bf9db75\" (UID: \"43f89ba5-2605-4bdb-bc77-020c6bf9db75\") " Mar 21 05:26:03 crc kubenswrapper[4775]: I0321 05:26:03.955286 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl" (OuterVolumeSpecName: "kube-api-access-nr2wl") pod "43f89ba5-2605-4bdb-bc77-020c6bf9db75" (UID: "43f89ba5-2605-4bdb-bc77-020c6bf9db75"). InnerVolumeSpecName "kube-api-access-nr2wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.046154 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr2wl\" (UniqueName: \"kubernetes.io/projected/43f89ba5-2605-4bdb-bc77-020c6bf9db75-kube-api-access-nr2wl\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.438959 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-6lslz" event={"ID":"43f89ba5-2605-4bdb-bc77-020c6bf9db75","Type":"ContainerDied","Data":"a0cf3c13c79d3900fe284a3ffd7e6e9a637b796bfedf0ece83845a7f24aeae65"} Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.439004 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0cf3c13c79d3900fe284a3ffd7e6e9a637b796bfedf0ece83845a7f24aeae65" Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.438999 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-6lslz" Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.888225 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-lm4bc"] Mar 21 05:26:04 crc kubenswrapper[4775]: I0321 05:26:04.898111 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-lm4bc"] Mar 21 05:26:05 crc kubenswrapper[4775]: I0321 05:26:05.673906 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90536c4e-c461-4d2c-95dd-f08660cb2e69" path="/var/lib/kubelet/pods/90536c4e-c461-4d2c-95dd-f08660cb2e69/volumes" Mar 21 05:26:11 crc kubenswrapper[4775]: I0321 05:26:11.662846 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:26:11 crc kubenswrapper[4775]: E0321 05:26:11.664298 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:26:26 crc kubenswrapper[4775]: I0321 05:26:26.661891 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:26:26 crc kubenswrapper[4775]: E0321 05:26:26.662809 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:26:40 crc kubenswrapper[4775]: I0321 05:26:40.662477 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:26:40 crc kubenswrapper[4775]: E0321 05:26:40.663321 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:26:47 crc kubenswrapper[4775]: I0321 05:26:47.239136 4775 scope.go:117] "RemoveContainer" containerID="383196299bf8ab4c199e90d21197235aa0a2587594bef58700d15b41f3c4abec" Mar 21 05:26:53 crc kubenswrapper[4775]: I0321 05:26:53.661721 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:26:53 crc kubenswrapper[4775]: E0321 05:26:53.662755 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:27:04 crc kubenswrapper[4775]: I0321 05:27:04.661788 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:27:04 crc kubenswrapper[4775]: E0321 05:27:04.662879 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:27:15 crc kubenswrapper[4775]: I0321 05:27:15.662876 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:27:15 crc kubenswrapper[4775]: E0321 05:27:15.663743 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:27:25 crc kubenswrapper[4775]: I0321 05:27:25.418913 4775 generic.go:334] "Generic (PLEG): container finished" podID="a71dcc90-c70a-4ff8-bf4a-42f1a2415827" containerID="1e34cd56643c271492ee835c6208defe78742763b335409f4aab2ec44574dc78" exitCode=0 Mar 21 05:27:25 crc kubenswrapper[4775]: I0321 05:27:25.419001 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" event={"ID":"a71dcc90-c70a-4ff8-bf4a-42f1a2415827","Type":"ContainerDied","Data":"1e34cd56643c271492ee835c6208defe78742763b335409f4aab2ec44574dc78"} Mar 21 05:27:26 crc kubenswrapper[4775]: I0321 05:27:26.981695 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.011192 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle\") pod \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.011575 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f844j\" (UniqueName: \"kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j\") pod \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.011655 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory\") pod \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.011814 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam\") pod \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.011885 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0\") pod \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\" (UID: \"a71dcc90-c70a-4ff8-bf4a-42f1a2415827\") " Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.019544 4775 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j" (OuterVolumeSpecName: "kube-api-access-f844j") pod "a71dcc90-c70a-4ff8-bf4a-42f1a2415827" (UID: "a71dcc90-c70a-4ff8-bf4a-42f1a2415827"). InnerVolumeSpecName "kube-api-access-f844j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.026502 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a71dcc90-c70a-4ff8-bf4a-42f1a2415827" (UID: "a71dcc90-c70a-4ff8-bf4a-42f1a2415827"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.049405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory" (OuterVolumeSpecName: "inventory") pod "a71dcc90-c70a-4ff8-bf4a-42f1a2415827" (UID: "a71dcc90-c70a-4ff8-bf4a-42f1a2415827"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.058971 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a71dcc90-c70a-4ff8-bf4a-42f1a2415827" (UID: "a71dcc90-c70a-4ff8-bf4a-42f1a2415827"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.069584 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a71dcc90-c70a-4ff8-bf4a-42f1a2415827" (UID: "a71dcc90-c70a-4ff8-bf4a-42f1a2415827"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.121157 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.121208 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.121217 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.121231 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f844j\" (UniqueName: \"kubernetes.io/projected/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-kube-api-access-f844j\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.121242 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a71dcc90-c70a-4ff8-bf4a-42f1a2415827-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.442193 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" event={"ID":"a71dcc90-c70a-4ff8-bf4a-42f1a2415827","Type":"ContainerDied","Data":"7a5a248239d97b47a9a8821636d4ceb0030545a6d3ce4d3885e064cebb074b6b"} Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.442260 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5a248239d97b47a9a8821636d4ceb0030545a6d3ce4d3885e064cebb074b6b" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.442368 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.565627 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7"] Mar 21 05:27:27 crc kubenswrapper[4775]: E0321 05:27:27.566367 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71dcc90-c70a-4ff8-bf4a-42f1a2415827" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.566390 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71dcc90-c70a-4ff8-bf4a-42f1a2415827" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:27 crc kubenswrapper[4775]: E0321 05:27:27.566408 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f89ba5-2605-4bdb-bc77-020c6bf9db75" containerName="oc" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.566415 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f89ba5-2605-4bdb-bc77-020c6bf9db75" containerName="oc" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.566665 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f89ba5-2605-4bdb-bc77-020c6bf9db75" containerName="oc" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.566683 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a71dcc90-c70a-4ff8-bf4a-42f1a2415827" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.567718 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.570520 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.570656 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.570764 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.571025 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.571274 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.573312 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.576639 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.586885 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7"] Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632883 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632909 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632938 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: 
\"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9d49\" (UniqueName: \"kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.632991 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.633024 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.633058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.633085 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.633106 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.735820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.735912 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.735990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736321 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736510 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9d49\" (UniqueName: \"kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.736586 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.742853 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 
21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.742908 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.742967 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.743011 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.743159 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.745191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.754549 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.754721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 
05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.755650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.756030 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.756547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.757374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.761768 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.762197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.762719 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9d49\" (UniqueName: \"kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.765753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-t85b7\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.889316 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:27:27 crc kubenswrapper[4775]: I0321 05:27:27.896691 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:27:28 crc kubenswrapper[4775]: I0321 05:27:28.463770 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7"] Mar 21 05:27:28 crc kubenswrapper[4775]: I0321 05:27:28.956869 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:27:29 crc kubenswrapper[4775]: I0321 05:27:29.465394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" event={"ID":"50003f97-774c-4321-9ddf-6ac67546b19f","Type":"ContainerStarted","Data":"1bd31fe06833c1b1b26bf4c74b0a110b0c9285cb71f3d43fb71c6f929a94bde5"} Mar 21 05:27:29 crc kubenswrapper[4775]: I0321 05:27:29.465750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" event={"ID":"50003f97-774c-4321-9ddf-6ac67546b19f","Type":"ContainerStarted","Data":"05426bdb669ddf8a7c54efd23e97575c7cd08628924364b8b1f0fca7b46f8ad1"} Mar 21 05:27:29 crc kubenswrapper[4775]: I0321 05:27:29.488367 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" podStartSLOduration=2.00543805 podStartE2EDuration="2.488344856s" podCreationTimestamp="2026-03-21 05:27:27 +0000 UTC" firstStartedPulling="2026-03-21 05:27:28.47030355 +0000 UTC m=+2401.446767174" lastFinishedPulling="2026-03-21 05:27:28.953210356 +0000 UTC m=+2401.929673980" observedRunningTime="2026-03-21 05:27:29.48241454 +0000 UTC m=+2402.458878164" watchObservedRunningTime="2026-03-21 05:27:29.488344856 +0000 UTC m=+2402.464808480" Mar 21 05:27:30 crc kubenswrapper[4775]: I0321 05:27:30.662057 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:27:30 crc kubenswrapper[4775]: E0321 05:27:30.662793 4775 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:27:43 crc kubenswrapper[4775]: I0321 05:27:43.661724 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:27:43 crc kubenswrapper[4775]: E0321 05:27:43.662677 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:27:54 crc kubenswrapper[4775]: I0321 05:27:54.662247 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:27:54 crc kubenswrapper[4775]: E0321 05:27:54.663301 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.160289 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567848-6lggf"] Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 
05:28:00.162436 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.165813 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.167526 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.168020 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.173555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-6lggf"] Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.294092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wwl\" (UniqueName: \"kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl\") pod \"auto-csr-approver-29567848-6lggf\" (UID: \"5d6c8144-5186-4a29-a208-b929f47b1695\") " pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.397721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wwl\" (UniqueName: \"kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl\") pod \"auto-csr-approver-29567848-6lggf\" (UID: \"5d6c8144-5186-4a29-a208-b929f47b1695\") " pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.426094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wwl\" (UniqueName: \"kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl\") pod 
\"auto-csr-approver-29567848-6lggf\" (UID: \"5d6c8144-5186-4a29-a208-b929f47b1695\") " pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.494017 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:00 crc kubenswrapper[4775]: I0321 05:28:00.962802 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-6lggf"] Mar 21 05:28:01 crc kubenswrapper[4775]: I0321 05:28:01.803292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-6lggf" event={"ID":"5d6c8144-5186-4a29-a208-b929f47b1695","Type":"ContainerStarted","Data":"8461d8c1fa111103e9f9eae3a8d53c59bf4bf66b5c81f1cafcc6ae54ee1679aa"} Mar 21 05:28:02 crc kubenswrapper[4775]: I0321 05:28:02.827518 4775 generic.go:334] "Generic (PLEG): container finished" podID="5d6c8144-5186-4a29-a208-b929f47b1695" containerID="b0bde1a2f3bc4718dd2d5ab50594a32268b2724339820463f90a790bd3794b31" exitCode=0 Mar 21 05:28:02 crc kubenswrapper[4775]: I0321 05:28:02.828219 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-6lggf" event={"ID":"5d6c8144-5186-4a29-a208-b929f47b1695","Type":"ContainerDied","Data":"b0bde1a2f3bc4718dd2d5ab50594a32268b2724339820463f90a790bd3794b31"} Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.147913 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.291887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6wwl\" (UniqueName: \"kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl\") pod \"5d6c8144-5186-4a29-a208-b929f47b1695\" (UID: \"5d6c8144-5186-4a29-a208-b929f47b1695\") " Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.303334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl" (OuterVolumeSpecName: "kube-api-access-j6wwl") pod "5d6c8144-5186-4a29-a208-b929f47b1695" (UID: "5d6c8144-5186-4a29-a208-b929f47b1695"). InnerVolumeSpecName "kube-api-access-j6wwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.395711 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6wwl\" (UniqueName: \"kubernetes.io/projected/5d6c8144-5186-4a29-a208-b929f47b1695-kube-api-access-j6wwl\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.851249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-6lggf" event={"ID":"5d6c8144-5186-4a29-a208-b929f47b1695","Type":"ContainerDied","Data":"8461d8c1fa111103e9f9eae3a8d53c59bf4bf66b5c81f1cafcc6ae54ee1679aa"} Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.851297 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8461d8c1fa111103e9f9eae3a8d53c59bf4bf66b5c81f1cafcc6ae54ee1679aa" Mar 21 05:28:04 crc kubenswrapper[4775]: I0321 05:28:04.851355 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-6lggf" Mar 21 05:28:05 crc kubenswrapper[4775]: I0321 05:28:05.228967 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-zs7vw"] Mar 21 05:28:05 crc kubenswrapper[4775]: I0321 05:28:05.241197 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-zs7vw"] Mar 21 05:28:05 crc kubenswrapper[4775]: I0321 05:28:05.672456 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616ac3d7-a66e-45e3-b9ca-257dbd29e212" path="/var/lib/kubelet/pods/616ac3d7-a66e-45e3-b9ca-257dbd29e212/volumes" Mar 21 05:28:07 crc kubenswrapper[4775]: I0321 05:28:07.670177 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:28:07 crc kubenswrapper[4775]: E0321 05:28:07.670661 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:28:20 crc kubenswrapper[4775]: I0321 05:28:20.662142 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:28:20 crc kubenswrapper[4775]: E0321 05:28:20.662876 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:28:35 crc kubenswrapper[4775]: I0321 05:28:35.662632 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:28:35 crc kubenswrapper[4775]: E0321 05:28:35.664406 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:28:47 crc kubenswrapper[4775]: I0321 05:28:47.350211 4775 scope.go:117] "RemoveContainer" containerID="a383e6d9b0bff1f3c71589e15f0f0c78443f3a1f70c9b149974bd8327eb1f252" Mar 21 05:28:50 crc kubenswrapper[4775]: I0321 05:28:50.662485 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:28:50 crc kubenswrapper[4775]: E0321 05:28:50.663338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:29:02 crc kubenswrapper[4775]: I0321 05:29:02.661947 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:29:02 crc kubenswrapper[4775]: E0321 05:29:02.663111 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:29:16 crc kubenswrapper[4775]: I0321 05:29:16.662090 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:29:16 crc kubenswrapper[4775]: E0321 05:29:16.663308 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:29:31 crc kubenswrapper[4775]: I0321 05:29:31.661917 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:29:31 crc kubenswrapper[4775]: E0321 05:29:31.662918 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:29:42 crc kubenswrapper[4775]: I0321 05:29:42.662973 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:29:42 crc kubenswrapper[4775]: E0321 05:29:42.663960 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:29:45 crc kubenswrapper[4775]: I0321 05:29:45.133823 4775 generic.go:334] "Generic (PLEG): container finished" podID="50003f97-774c-4321-9ddf-6ac67546b19f" containerID="1bd31fe06833c1b1b26bf4c74b0a110b0c9285cb71f3d43fb71c6f929a94bde5" exitCode=0 Mar 21 05:29:45 crc kubenswrapper[4775]: I0321 05:29:45.134319 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" event={"ID":"50003f97-774c-4321-9ddf-6ac67546b19f","Type":"ContainerDied","Data":"1bd31fe06833c1b1b26bf4c74b0a110b0c9285cb71f3d43fb71c6f929a94bde5"} Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.632371 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732442 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9d49\" (UniqueName: \"kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732612 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732635 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732721 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3\") pod 
\"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.732858 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.733001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.733056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory\") pod \"50003f97-774c-4321-9ddf-6ac67546b19f\" (UID: \"50003f97-774c-4321-9ddf-6ac67546b19f\") " Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.750704 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49" (OuterVolumeSpecName: "kube-api-access-t9d49") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "kube-api-access-t9d49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.756319 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.766988 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.770872 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.777497 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.779829 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.782434 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory" (OuterVolumeSpecName: "inventory") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.782656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.792756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.804032 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.805994 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "50003f97-774c-4321-9ddf-6ac67546b19f" (UID: "50003f97-774c-4321-9ddf-6ac67546b19f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835705 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835756 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835771 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835780 4775 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50003f97-774c-4321-9ddf-6ac67546b19f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835794 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 
05:29:46.835804 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835813 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835825 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835837 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835848 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9d49\" (UniqueName: \"kubernetes.io/projected/50003f97-774c-4321-9ddf-6ac67546b19f-kube-api-access-t9d49\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:46 crc kubenswrapper[4775]: I0321 05:29:46.835860 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50003f97-774c-4321-9ddf-6ac67546b19f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.153174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" event={"ID":"50003f97-774c-4321-9ddf-6ac67546b19f","Type":"ContainerDied","Data":"05426bdb669ddf8a7c54efd23e97575c7cd08628924364b8b1f0fca7b46f8ad1"} Mar 21 05:29:47 crc 
kubenswrapper[4775]: I0321 05:29:47.153222 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05426bdb669ddf8a7c54efd23e97575c7cd08628924364b8b1f0fca7b46f8ad1" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.153221 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-t85b7" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.286537 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg"] Mar 21 05:29:47 crc kubenswrapper[4775]: E0321 05:29:47.286979 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6c8144-5186-4a29-a208-b929f47b1695" containerName="oc" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.286998 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6c8144-5186-4a29-a208-b929f47b1695" containerName="oc" Mar 21 05:29:47 crc kubenswrapper[4775]: E0321 05:29:47.287040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50003f97-774c-4321-9ddf-6ac67546b19f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.287049 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="50003f97-774c-4321-9ddf-6ac67546b19f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.287411 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="50003f97-774c-4321-9ddf-6ac67546b19f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.287435 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6c8144-5186-4a29-a208-b929f47b1695" containerName="oc" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.288559 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.292994 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.293098 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.293016 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.293342 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.293555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ljtt5" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.316307 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg"] Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.348597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.348650 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.348697 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.348932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pskfk\" (UniqueName: \"kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.349113 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.349374 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.349642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.432518 4775 scope.go:117] "RemoveContainer" containerID="441225cb122bf66cabf5c48f54264da9293c307b82c5b70551d4830941ce387b" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.452601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.452720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pskfk\" (UniqueName: \"kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.452789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.452902 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.453042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.453157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.453202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.460827 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.462608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.463223 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.464846 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.465196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.469194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.485202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pskfk\" (UniqueName: \"kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.486453 4775 scope.go:117] "RemoveContainer" containerID="a85357de462c6ddc32de73140ccff74411e8b1a6bb88a57b428ef54e3561339b" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.518805 4775 scope.go:117] "RemoveContainer" containerID="cd99f6278b5762bb26ce5bb130e6b5017494ab8ebb62ead08c30e6235d692bd9" Mar 21 05:29:47 crc kubenswrapper[4775]: I0321 05:29:47.615024 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" Mar 21 05:29:48 crc kubenswrapper[4775]: I0321 05:29:48.179704 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg"] Mar 21 05:29:49 crc kubenswrapper[4775]: I0321 05:29:49.174304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" event={"ID":"bccbefa9-966d-44b9-bd8f-bb566649b315","Type":"ContainerStarted","Data":"8636db08b5425e5c464dc8f5aae692b68bdab14b9b01bcd67b6031e570796fa6"} Mar 21 05:29:50 crc kubenswrapper[4775]: I0321 05:29:50.186903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" event={"ID":"bccbefa9-966d-44b9-bd8f-bb566649b315","Type":"ContainerStarted","Data":"f29e85380399c91a14fac8a091af8cf23a0384a224751b0b587f895a3d44aec1"} Mar 21 05:29:50 crc kubenswrapper[4775]: I0321 05:29:50.213348 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" podStartSLOduration=1.999674274 podStartE2EDuration="3.213326297s" podCreationTimestamp="2026-03-21 05:29:47 +0000 UTC" firstStartedPulling="2026-03-21 05:29:48.179520399 +0000 UTC m=+2541.155984023" lastFinishedPulling="2026-03-21 05:29:49.393172412 +0000 UTC m=+2542.369636046" observedRunningTime="2026-03-21 05:29:50.21237066 +0000 UTC m=+2543.188834284" watchObservedRunningTime="2026-03-21 05:29:50.213326297 +0000 UTC m=+2543.189789911" Mar 21 05:29:55 crc kubenswrapper[4775]: I0321 05:29:55.662212 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:29:55 crc kubenswrapper[4775]: E0321 05:29:55.663488 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.163780 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567850-gfbv5"] Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.166372 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.168964 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.172181 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.172331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.176420 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9"] Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.178995 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.181288 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.181695 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.184430 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bf67\" (UniqueName: \"kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67\") pod \"auto-csr-approver-29567850-gfbv5\" (UID: \"2616ee6e-f4b6-4258-b738-95ad6aab1644\") " pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.187911 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9"] Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.200060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-gfbv5"] Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.286283 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.286367 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bf67\" (UniqueName: 
\"kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67\") pod \"auto-csr-approver-29567850-gfbv5\" (UID: \"2616ee6e-f4b6-4258-b738-95ad6aab1644\") " pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.286395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.286473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrsmx\" (UniqueName: \"kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.307238 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bf67\" (UniqueName: \"kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67\") pod \"auto-csr-approver-29567850-gfbv5\" (UID: \"2616ee6e-f4b6-4258-b738-95ad6aab1644\") " pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.388815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: 
I0321 05:30:00.389273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.389327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrsmx\" (UniqueName: \"kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.391163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.395778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.412668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrsmx\" (UniqueName: \"kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx\") pod \"collect-profiles-29567850-mp2j9\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.493977 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:00 crc kubenswrapper[4775]: I0321 05:30:00.505736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:01 crc kubenswrapper[4775]: I0321 05:30:01.019289 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-gfbv5"] Mar 21 05:30:01 crc kubenswrapper[4775]: I0321 05:30:01.119640 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9"] Mar 21 05:30:01 crc kubenswrapper[4775]: W0321 05:30:01.122800 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3215b5b_802c_4a03_b546_8e3ebe5f60b2.slice/crio-d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792 WatchSource:0}: Error finding container d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792: Status 404 returned error can't find the container with id d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792 Mar 21 05:30:01 crc kubenswrapper[4775]: I0321 05:30:01.350916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" event={"ID":"2616ee6e-f4b6-4258-b738-95ad6aab1644","Type":"ContainerStarted","Data":"5d4a88f575940f0e3ab1a15b409729e8419ee9828f39ed8e30ed6a9f8a6eb5f0"} Mar 21 05:30:01 crc kubenswrapper[4775]: I0321 05:30:01.353053 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" 
event={"ID":"a3215b5b-802c-4a03-b546-8e3ebe5f60b2","Type":"ContainerStarted","Data":"d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792"} Mar 21 05:30:02 crc kubenswrapper[4775]: I0321 05:30:02.366973 4775 generic.go:334] "Generic (PLEG): container finished" podID="a3215b5b-802c-4a03-b546-8e3ebe5f60b2" containerID="e0f13d79ecf702c5e141a30fa8c4b5464aeca1ab255956281f9d561257381418" exitCode=0 Mar 21 05:30:02 crc kubenswrapper[4775]: I0321 05:30:02.367435 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" event={"ID":"a3215b5b-802c-4a03-b546-8e3ebe5f60b2","Type":"ContainerDied","Data":"e0f13d79ecf702c5e141a30fa8c4b5464aeca1ab255956281f9d561257381418"} Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.765808 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.872009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrsmx\" (UniqueName: \"kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx\") pod \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.872374 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume\") pod \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\" (UID: \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.872582 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume\") pod \"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\" (UID: 
\"a3215b5b-802c-4a03-b546-8e3ebe5f60b2\") " Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.873330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3215b5b-802c-4a03-b546-8e3ebe5f60b2" (UID: "a3215b5b-802c-4a03-b546-8e3ebe5f60b2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.881471 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx" (OuterVolumeSpecName: "kube-api-access-qrsmx") pod "a3215b5b-802c-4a03-b546-8e3ebe5f60b2" (UID: "a3215b5b-802c-4a03-b546-8e3ebe5f60b2"). InnerVolumeSpecName "kube-api-access-qrsmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.896463 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3215b5b-802c-4a03-b546-8e3ebe5f60b2" (UID: "a3215b5b-802c-4a03-b546-8e3ebe5f60b2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.975690 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.976109 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrsmx\" (UniqueName: \"kubernetes.io/projected/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-kube-api-access-qrsmx\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:03 crc kubenswrapper[4775]: I0321 05:30:03.976194 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3215b5b-802c-4a03-b546-8e3ebe5f60b2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.401311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" event={"ID":"a3215b5b-802c-4a03-b546-8e3ebe5f60b2","Type":"ContainerDied","Data":"d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792"} Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.401404 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d90243f8615cf1c263b1b17b7bb8bda9e62e280a115b1f6d61465f1d0981e792" Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.401603 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-mp2j9" Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.404370 4775 generic.go:334] "Generic (PLEG): container finished" podID="2616ee6e-f4b6-4258-b738-95ad6aab1644" containerID="2262a4a2de084f40220b6b85086a44442e4d874dfee1c84bf3dcc589fdc32198" exitCode=0 Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.404410 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" event={"ID":"2616ee6e-f4b6-4258-b738-95ad6aab1644","Type":"ContainerDied","Data":"2262a4a2de084f40220b6b85086a44442e4d874dfee1c84bf3dcc589fdc32198"} Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.851429 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct"] Mar 21 05:30:04 crc kubenswrapper[4775]: I0321 05:30:04.859468 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-97bct"] Mar 21 05:30:05 crc kubenswrapper[4775]: I0321 05:30:05.675297 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5aa6958-e573-4efb-a031-218c62b0bec9" path="/var/lib/kubelet/pods/a5aa6958-e573-4efb-a031-218c62b0bec9/volumes" Mar 21 05:30:05 crc kubenswrapper[4775]: I0321 05:30:05.798926 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:05 crc kubenswrapper[4775]: I0321 05:30:05.954262 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bf67\" (UniqueName: \"kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67\") pod \"2616ee6e-f4b6-4258-b738-95ad6aab1644\" (UID: \"2616ee6e-f4b6-4258-b738-95ad6aab1644\") " Mar 21 05:30:05 crc kubenswrapper[4775]: I0321 05:30:05.967505 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67" (OuterVolumeSpecName: "kube-api-access-7bf67") pod "2616ee6e-f4b6-4258-b738-95ad6aab1644" (UID: "2616ee6e-f4b6-4258-b738-95ad6aab1644"). InnerVolumeSpecName "kube-api-access-7bf67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.056744 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bf67\" (UniqueName: \"kubernetes.io/projected/2616ee6e-f4b6-4258-b738-95ad6aab1644-kube-api-access-7bf67\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.430977 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" event={"ID":"2616ee6e-f4b6-4258-b738-95ad6aab1644","Type":"ContainerDied","Data":"5d4a88f575940f0e3ab1a15b409729e8419ee9828f39ed8e30ed6a9f8a6eb5f0"} Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.431491 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4a88f575940f0e3ab1a15b409729e8419ee9828f39ed8e30ed6a9f8a6eb5f0" Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.431057 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-gfbv5" Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.871636 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-25mvs"] Mar 21 05:30:06 crc kubenswrapper[4775]: I0321 05:30:06.885170 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-25mvs"] Mar 21 05:30:07 crc kubenswrapper[4775]: I0321 05:30:07.673704 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6" path="/var/lib/kubelet/pods/c589176d-45ef-4e7a-b8f8-0ec31e4fe6a6/volumes" Mar 21 05:30:08 crc kubenswrapper[4775]: I0321 05:30:08.662771 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:30:08 crc kubenswrapper[4775]: E0321 05:30:08.663107 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:30:21 crc kubenswrapper[4775]: I0321 05:30:21.661808 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:30:21 crc kubenswrapper[4775]: E0321 05:30:21.662833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:30:35 crc kubenswrapper[4775]: I0321 05:30:35.662405 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:30:36 crc kubenswrapper[4775]: I0321 05:30:36.728031 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee"} Mar 21 05:30:47 crc kubenswrapper[4775]: I0321 05:30:47.611990 4775 scope.go:117] "RemoveContainer" containerID="5936961592e0ba9da2bfbec1bc18d0df5f38a71ab49e8c90edaa7dea4ccb02ee" Mar 21 05:30:47 crc kubenswrapper[4775]: I0321 05:30:47.669875 4775 scope.go:117] "RemoveContainer" containerID="56e35510c790be074d09f5ee8f124d413aaf4e647a0fe13993e83a909111b979" Mar 21 05:30:47 crc kubenswrapper[4775]: I0321 05:30:47.702042 4775 scope.go:117] "RemoveContainer" containerID="1b93f4361b64d716502128254e9ea25b78ba9ed772325baf5c04d50a5a77ab40" Mar 21 05:31:47 crc kubenswrapper[4775]: I0321 05:31:47.808901 4775 scope.go:117] "RemoveContainer" containerID="60fb5be3131696012c97db115a74ff472115d80eb7fd9153e9102904c69c91b3" Mar 21 05:31:47 crc kubenswrapper[4775]: I0321 05:31:47.843259 4775 scope.go:117] "RemoveContainer" containerID="a2a8c8dc1d88f43d6ededf2e8793a512b5087dcbf7e313f245d34485e136258b" Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.154137 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567852-hd7nd"] Mar 21 05:32:00 crc kubenswrapper[4775]: E0321 05:32:00.155615 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2616ee6e-f4b6-4258-b738-95ad6aab1644" containerName="oc" Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.155635 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2616ee6e-f4b6-4258-b738-95ad6aab1644" containerName="oc" 
Mar 21 05:32:00 crc kubenswrapper[4775]: E0321 05:32:00.155684 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3215b5b-802c-4a03-b546-8e3ebe5f60b2" containerName="collect-profiles"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.155692 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3215b5b-802c-4a03-b546-8e3ebe5f60b2" containerName="collect-profiles"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.156009 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2616ee6e-f4b6-4258-b738-95ad6aab1644" containerName="oc"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.156030 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3215b5b-802c-4a03-b546-8e3ebe5f60b2" containerName="collect-profiles"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.157164 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.163662 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.163831 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.164233 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.166771 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-hd7nd"]
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.216827 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xk4k\" (UniqueName: \"kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k\") pod \"auto-csr-approver-29567852-hd7nd\" (UID: \"15017759-960b-4d6e-8ce9-deef2cb94155\") " pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.318984 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xk4k\" (UniqueName: \"kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k\") pod \"auto-csr-approver-29567852-hd7nd\" (UID: \"15017759-960b-4d6e-8ce9-deef2cb94155\") " pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.346908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xk4k\" (UniqueName: \"kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k\") pod \"auto-csr-approver-29567852-hd7nd\" (UID: \"15017759-960b-4d6e-8ce9-deef2cb94155\") " pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:00 crc kubenswrapper[4775]: I0321 05:32:00.483198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:01 crc kubenswrapper[4775]: I0321 05:32:01.007670 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-hd7nd"]
Mar 21 05:32:01 crc kubenswrapper[4775]: I0321 05:32:01.014546 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:32:01 crc kubenswrapper[4775]: I0321 05:32:01.636163 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-hd7nd" event={"ID":"15017759-960b-4d6e-8ce9-deef2cb94155","Type":"ContainerStarted","Data":"6a3a151d48881d011e498b74117ddba9087c8a3bccd9ea12c290dab570221f0c"}
Mar 21 05:32:02 crc kubenswrapper[4775]: I0321 05:32:02.651338 4775 generic.go:334] "Generic (PLEG): container finished" podID="15017759-960b-4d6e-8ce9-deef2cb94155" containerID="25828aaca134034962b25d7ea45e1870919d75c0c4923758d122b6c7bce2397c" exitCode=0
Mar 21 05:32:02 crc kubenswrapper[4775]: I0321 05:32:02.651466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-hd7nd" event={"ID":"15017759-960b-4d6e-8ce9-deef2cb94155","Type":"ContainerDied","Data":"25828aaca134034962b25d7ea45e1870919d75c0c4923758d122b6c7bce2397c"}
Mar 21 05:32:02 crc kubenswrapper[4775]: I0321 05:32:02.654622 4775 generic.go:334] "Generic (PLEG): container finished" podID="bccbefa9-966d-44b9-bd8f-bb566649b315" containerID="f29e85380399c91a14fac8a091af8cf23a0384a224751b0b587f895a3d44aec1" exitCode=0
Mar 21 05:32:02 crc kubenswrapper[4775]: I0321 05:32:02.654680 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" event={"ID":"bccbefa9-966d-44b9-bd8f-bb566649b315","Type":"ContainerDied","Data":"f29e85380399c91a14fac8a091af8cf23a0384a224751b0b587f895a3d44aec1"}
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.164145 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.170746 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg"
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.325590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pskfk\" (UniqueName: \"kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.325799 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.325844 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xk4k\" (UniqueName: \"kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k\") pod \"15017759-960b-4d6e-8ce9-deef2cb94155\" (UID: \"15017759-960b-4d6e-8ce9-deef2cb94155\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.325927 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.326040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.326099 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.326172 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.326243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1\") pod \"bccbefa9-966d-44b9-bd8f-bb566649b315\" (UID: \"bccbefa9-966d-44b9-bd8f-bb566649b315\") "
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.335654 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk" (OuterVolumeSpecName: "kube-api-access-pskfk") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "kube-api-access-pskfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.336544 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.336791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k" (OuterVolumeSpecName: "kube-api-access-4xk4k") pod "15017759-960b-4d6e-8ce9-deef2cb94155" (UID: "15017759-960b-4d6e-8ce9-deef2cb94155"). InnerVolumeSpecName "kube-api-access-4xk4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.361441 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.369426 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.369535 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory" (OuterVolumeSpecName: "inventory") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.372963 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.374435 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "bccbefa9-966d-44b9-bd8f-bb566649b315" (UID: "bccbefa9-966d-44b9-bd8f-bb566649b315"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430222 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pskfk\" (UniqueName: \"kubernetes.io/projected/bccbefa9-966d-44b9-bd8f-bb566649b315-kube-api-access-pskfk\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430352 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430411 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xk4k\" (UniqueName: \"kubernetes.io/projected/15017759-960b-4d6e-8ce9-deef2cb94155-kube-api-access-4xk4k\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430468 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430520 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430587 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430645 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.430706 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/bccbefa9-966d-44b9-bd8f-bb566649b315-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.675657 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg" event={"ID":"bccbefa9-966d-44b9-bd8f-bb566649b315","Type":"ContainerDied","Data":"8636db08b5425e5c464dc8f5aae692b68bdab14b9b01bcd67b6031e570796fa6"}
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.675747 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8636db08b5425e5c464dc8f5aae692b68bdab14b9b01bcd67b6031e570796fa6"
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.675752 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg"
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.677507 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-hd7nd" event={"ID":"15017759-960b-4d6e-8ce9-deef2cb94155","Type":"ContainerDied","Data":"6a3a151d48881d011e498b74117ddba9087c8a3bccd9ea12c290dab570221f0c"}
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.677551 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3a151d48881d011e498b74117ddba9087c8a3bccd9ea12c290dab570221f0c"
Mar 21 05:32:04 crc kubenswrapper[4775]: I0321 05:32:04.677616 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-hd7nd"
Mar 21 05:32:05 crc kubenswrapper[4775]: I0321 05:32:05.259558 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-6lslz"]
Mar 21 05:32:05 crc kubenswrapper[4775]: I0321 05:32:05.274251 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-6lslz"]
Mar 21 05:32:05 crc kubenswrapper[4775]: I0321 05:32:05.673113 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f89ba5-2605-4bdb-bc77-020c6bf9db75" path="/var/lib/kubelet/pods/43f89ba5-2605-4bdb-bc77-020c6bf9db75/volumes"
Mar 21 05:32:33 crc kubenswrapper[4775]: E0321 05:32:33.182775 4775 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.200:36540->38.102.83.200:42597: read tcp 38.102.83.200:36540->38.102.83.200:42597: read: connection reset by peer
Mar 21 05:32:47 crc kubenswrapper[4775]: I0321 05:32:47.911996 4775 scope.go:117] "RemoveContainer" containerID="a8931f9c5d17f0e3f84a26748f81ac4ac575c43d3a88da2c5e6613b2fee09eee"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.217909 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 21 05:32:59 crc kubenswrapper[4775]: E0321 05:32:59.219040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbefa9-966d-44b9-bd8f-bb566649b315" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.219057 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbefa9-966d-44b9-bd8f-bb566649b315" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:32:59 crc kubenswrapper[4775]: E0321 05:32:59.219082 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15017759-960b-4d6e-8ce9-deef2cb94155" containerName="oc"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.219089 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="15017759-960b-4d6e-8ce9-deef2cb94155" containerName="oc"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.219287 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="15017759-960b-4d6e-8ce9-deef2cb94155" containerName="oc"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.219303 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbefa9-966d-44b9-bd8f-bb566649b315" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.220002 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.223302 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.223698 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.225093 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.226406 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bnv6q"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.246025 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.279600 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.279728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.279869 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381546 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381623 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381784 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381872 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdw2v\" (UniqueName: \"kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381912 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.381946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.384146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.385282 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.400324 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484092 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdw2v\" (UniqueName: \"kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484594 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.484661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.485066 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.489430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.490462 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.505455 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdw2v\" (UniqueName: \"kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.518317 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") " pod="openstack/tempest-tests-tempest"
Mar 21 05:32:59 crc kubenswrapper[4775]: I0321 05:32:59.552965 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:33:00 crc kubenswrapper[4775]: I0321 05:33:00.086047 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 21 05:33:00 crc kubenswrapper[4775]: I0321 05:33:00.305058 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1c832898-838d-423d-8ad8-512c5ee5706c","Type":"ContainerStarted","Data":"695de001f7207ca22144c3ce97e3ea6306a89b43020987a1d4856f3f9e289416"}
Mar 21 05:33:02 crc kubenswrapper[4775]: I0321 05:33:02.482249 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:33:02 crc kubenswrapper[4775]: I0321 05:33:02.483169 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:33:12 crc kubenswrapper[4775]: E0321 05:33:12.008706 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]"
Mar 21 05:33:31 crc kubenswrapper[4775]: E0321 05:33:31.349343 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Mar 21 05:33:31 crc kubenswrapper[4775]: E0321 05:33:31.350316 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdw2v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(1c832898-838d-423d-8ad8-512c5ee5706c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 21 05:33:31 crc kubenswrapper[4775]: E0321 05:33:31.351792 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="1c832898-838d-423d-8ad8-512c5ee5706c"
Mar 21 05:33:31 crc kubenswrapper[4775]: E0321 05:33:31.776680 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="1c832898-838d-423d-8ad8-512c5ee5706c"
Mar 21 05:33:32 crc kubenswrapper[4775]: I0321 05:33:32.482084 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:33:32 crc kubenswrapper[4775]: I0321 05:33:32.482555 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:33:45 crc kubenswrapper[4775]: I0321 05:33:45.426854 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 21 05:33:46 crc kubenswrapper[4775]: I0321 05:33:46.945670 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1c832898-838d-423d-8ad8-512c5ee5706c","Type":"ContainerStarted","Data":"e5d5a6749ca728ee30628ed8c23d0df8c8193d878227278bf224646abd9c8806"}
Mar 21 05:33:46 crc kubenswrapper[4775]: I0321 05:33:46.972863 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.6497522570000003 podStartE2EDuration="48.972841682s" podCreationTimestamp="2026-03-21 05:32:58 +0000 UTC" firstStartedPulling="2026-03-21 05:33:00.099423383 +0000 UTC m=+2733.075887047" lastFinishedPulling="2026-03-21 05:33:45.422512798 +0000 UTC m=+2778.398976472" observedRunningTime="2026-03-21 05:33:46.969324291 +0000 UTC m=+2779.945787925" watchObservedRunningTime="2026-03-21 05:33:46.972841682 +0000 UTC m=+2779.949305306"
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.367616 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4chpg"]
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.372667 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4chpg"
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.388433 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4chpg"]
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.485896 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg"
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.486092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg"
Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.486204 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q46w\" (UniqueName: 
\"kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.588135 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q46w\" (UniqueName: \"kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.588240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.588361 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.588977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.589613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.612179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q46w\" (UniqueName: \"kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w\") pod \"certified-operators-4chpg\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:50 crc kubenswrapper[4775]: I0321 05:33:50.701291 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:33:51 crc kubenswrapper[4775]: I0321 05:33:51.314668 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4chpg"] Mar 21 05:33:52 crc kubenswrapper[4775]: I0321 05:33:51.999601 4775 generic.go:334] "Generic (PLEG): container finished" podID="501afe19-8db4-4d25-8e06-56d1449b3643" containerID="4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd" exitCode=0 Mar 21 05:33:52 crc kubenswrapper[4775]: I0321 05:33:51.999666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerDied","Data":"4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd"} Mar 21 05:33:52 crc kubenswrapper[4775]: I0321 05:33:52.000004 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerStarted","Data":"07eb8f05c11ebbbb5bb476c41bb59b876a26256bd2f154d72bba449f69c622df"} Mar 21 05:33:54 crc kubenswrapper[4775]: I0321 05:33:54.026233 4775 generic.go:334] "Generic (PLEG): container 
finished" podID="501afe19-8db4-4d25-8e06-56d1449b3643" containerID="c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029" exitCode=0 Mar 21 05:33:54 crc kubenswrapper[4775]: I0321 05:33:54.026296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerDied","Data":"c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029"} Mar 21 05:33:55 crc kubenswrapper[4775]: I0321 05:33:55.042095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerStarted","Data":"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b"} Mar 21 05:33:55 crc kubenswrapper[4775]: I0321 05:33:55.072922 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4chpg" podStartSLOduration=2.584060545 podStartE2EDuration="5.072897873s" podCreationTimestamp="2026-03-21 05:33:50 +0000 UTC" firstStartedPulling="2026-03-21 05:33:52.006080577 +0000 UTC m=+2784.982544201" lastFinishedPulling="2026-03-21 05:33:54.494917905 +0000 UTC m=+2787.471381529" observedRunningTime="2026-03-21 05:33:55.062922335 +0000 UTC m=+2788.039385969" watchObservedRunningTime="2026-03-21 05:33:55.072897873 +0000 UTC m=+2788.049361497" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.146167 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567854-gxf82"] Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.148172 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.150883 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.150884 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.152321 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.158461 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-gxf82"] Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.317045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kzr\" (UniqueName: \"kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr\") pod \"auto-csr-approver-29567854-gxf82\" (UID: \"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15\") " pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.420108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kzr\" (UniqueName: \"kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr\") pod \"auto-csr-approver-29567854-gxf82\" (UID: \"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15\") " pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.451192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kzr\" (UniqueName: \"kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr\") pod \"auto-csr-approver-29567854-gxf82\" (UID: \"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15\") " 
pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.474541 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.702222 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.702596 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:00 crc kubenswrapper[4775]: I0321 05:34:00.777731 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:01 crc kubenswrapper[4775]: I0321 05:34:01.040104 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-gxf82"] Mar 21 05:34:01 crc kubenswrapper[4775]: I0321 05:34:01.107590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-gxf82" event={"ID":"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15","Type":"ContainerStarted","Data":"0d244af90f1f3cc12a466a4d5d229c239003d7b502e8be0d3f5edcd24ca33e5a"} Mar 21 05:34:01 crc kubenswrapper[4775]: I0321 05:34:01.161663 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:01 crc kubenswrapper[4775]: I0321 05:34:01.213955 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4chpg"] Mar 21 05:34:02 crc kubenswrapper[4775]: I0321 05:34:02.482840 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 21 05:34:02 crc kubenswrapper[4775]: I0321 05:34:02.483365 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:34:02 crc kubenswrapper[4775]: I0321 05:34:02.483414 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:34:02 crc kubenswrapper[4775]: I0321 05:34:02.485368 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:34:02 crc kubenswrapper[4775]: I0321 05:34:02.485420 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee" gracePeriod=600 Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.129897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-gxf82" event={"ID":"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15","Type":"ContainerStarted","Data":"20714a401828b9647e317e22423f1e191dc985cb13d154d0fa57a94896564896"} Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.133093 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" 
containerID="ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee" exitCode=0 Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.133178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee"} Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.133263 4775 scope.go:117] "RemoveContainer" containerID="21dc53d475648a02ba4409f119cfb163b347fd5f7ea3ee585b368e11bf55d804" Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.133349 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4chpg" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="registry-server" containerID="cri-o://711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b" gracePeriod=2 Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.154019 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567854-gxf82" podStartSLOduration=1.902136061 podStartE2EDuration="3.153998978s" podCreationTimestamp="2026-03-21 05:34:00 +0000 UTC" firstStartedPulling="2026-03-21 05:34:01.056372519 +0000 UTC m=+2794.032836143" lastFinishedPulling="2026-03-21 05:34:02.308235406 +0000 UTC m=+2795.284699060" observedRunningTime="2026-03-21 05:34:03.148666904 +0000 UTC m=+2796.125130548" watchObservedRunningTime="2026-03-21 05:34:03.153998978 +0000 UTC m=+2796.130462592" Mar 21 05:34:03 crc kubenswrapper[4775]: E0321 05:34:03.443709 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501afe19_8db4_4d25_8e06_56d1449b3643.slice/crio-711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b.scope\": RecentStats: unable to 
find data in memory cache]" Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.748084 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.922329 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q46w\" (UniqueName: \"kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w\") pod \"501afe19-8db4-4d25-8e06-56d1449b3643\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.923211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content\") pod \"501afe19-8db4-4d25-8e06-56d1449b3643\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.923268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities\") pod \"501afe19-8db4-4d25-8e06-56d1449b3643\" (UID: \"501afe19-8db4-4d25-8e06-56d1449b3643\") " Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.924846 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities" (OuterVolumeSpecName: "utilities") pod "501afe19-8db4-4d25-8e06-56d1449b3643" (UID: "501afe19-8db4-4d25-8e06-56d1449b3643"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.947201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w" (OuterVolumeSpecName: "kube-api-access-8q46w") pod "501afe19-8db4-4d25-8e06-56d1449b3643" (UID: "501afe19-8db4-4d25-8e06-56d1449b3643"). InnerVolumeSpecName "kube-api-access-8q46w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:34:03 crc kubenswrapper[4775]: I0321 05:34:03.988518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "501afe19-8db4-4d25-8e06-56d1449b3643" (UID: "501afe19-8db4-4d25-8e06-56d1449b3643"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.026527 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.026575 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q46w\" (UniqueName: \"kubernetes.io/projected/501afe19-8db4-4d25-8e06-56d1449b3643-kube-api-access-8q46w\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.026588 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/501afe19-8db4-4d25-8e06-56d1449b3643-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.148162 4775 generic.go:334] "Generic (PLEG): container finished" podID="afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" 
containerID="20714a401828b9647e317e22423f1e191dc985cb13d154d0fa57a94896564896" exitCode=0 Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.148263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-gxf82" event={"ID":"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15","Type":"ContainerDied","Data":"20714a401828b9647e317e22423f1e191dc985cb13d154d0fa57a94896564896"} Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.153160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"} Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.158159 4775 generic.go:334] "Generic (PLEG): container finished" podID="501afe19-8db4-4d25-8e06-56d1449b3643" containerID="711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b" exitCode=0 Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.158216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerDied","Data":"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b"} Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.158254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4chpg" event={"ID":"501afe19-8db4-4d25-8e06-56d1449b3643","Type":"ContainerDied","Data":"07eb8f05c11ebbbb5bb476c41bb59b876a26256bd2f154d72bba449f69c622df"} Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.158277 4775 scope.go:117] "RemoveContainer" containerID="711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.158277 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4chpg" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.187549 4775 scope.go:117] "RemoveContainer" containerID="c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.223484 4775 scope.go:117] "RemoveContainer" containerID="4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.223534 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4chpg"] Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.237092 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4chpg"] Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.279132 4775 scope.go:117] "RemoveContainer" containerID="711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b" Mar 21 05:34:04 crc kubenswrapper[4775]: E0321 05:34:04.279745 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b\": container with ID starting with 711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b not found: ID does not exist" containerID="711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.279840 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b"} err="failed to get container status \"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b\": rpc error: code = NotFound desc = could not find container \"711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b\": container with ID starting with 711a82a5ea8d72323dd172e53eaa58caf06be2ca6f1b2822acee6887a437466b not 
found: ID does not exist" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.279901 4775 scope.go:117] "RemoveContainer" containerID="c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029" Mar 21 05:34:04 crc kubenswrapper[4775]: E0321 05:34:04.280441 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029\": container with ID starting with c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029 not found: ID does not exist" containerID="c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.280497 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029"} err="failed to get container status \"c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029\": rpc error: code = NotFound desc = could not find container \"c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029\": container with ID starting with c1908a0d86835431c41047756017337154557cff7b1ff87a8e9a3ad7ba39c029 not found: ID does not exist" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.280519 4775 scope.go:117] "RemoveContainer" containerID="4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd" Mar 21 05:34:04 crc kubenswrapper[4775]: E0321 05:34:04.280897 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd\": container with ID starting with 4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd not found: ID does not exist" containerID="4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd" Mar 21 05:34:04 crc kubenswrapper[4775]: I0321 05:34:04.280935 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd"} err="failed to get container status \"4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd\": rpc error: code = NotFound desc = could not find container \"4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd\": container with ID starting with 4237a001ec9e64bac729ab0c016c015a46734930328f1c25302192aa4945b0bd not found: ID does not exist" Mar 21 05:34:05 crc kubenswrapper[4775]: I0321 05:34:05.578531 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:05 crc kubenswrapper[4775]: I0321 05:34:05.668629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kzr\" (UniqueName: \"kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr\") pod \"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15\" (UID: \"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15\") " Mar 21 05:34:05 crc kubenswrapper[4775]: I0321 05:34:05.672575 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" path="/var/lib/kubelet/pods/501afe19-8db4-4d25-8e06-56d1449b3643/volumes" Mar 21 05:34:05 crc kubenswrapper[4775]: I0321 05:34:05.681545 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr" (OuterVolumeSpecName: "kube-api-access-b7kzr") pod "afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" (UID: "afdb9c39-b2aa-496c-8cfb-917fd0b1cc15"). InnerVolumeSpecName "kube-api-access-b7kzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:34:05 crc kubenswrapper[4775]: I0321 05:34:05.772197 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kzr\" (UniqueName: \"kubernetes.io/projected/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15-kube-api-access-b7kzr\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:06 crc kubenswrapper[4775]: I0321 05:34:06.180334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-gxf82" event={"ID":"afdb9c39-b2aa-496c-8cfb-917fd0b1cc15","Type":"ContainerDied","Data":"0d244af90f1f3cc12a466a4d5d229c239003d7b502e8be0d3f5edcd24ca33e5a"} Mar 21 05:34:06 crc kubenswrapper[4775]: I0321 05:34:06.180386 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d244af90f1f3cc12a466a4d5d229c239003d7b502e8be0d3f5edcd24ca33e5a" Mar 21 05:34:06 crc kubenswrapper[4775]: I0321 05:34:06.180460 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-gxf82" Mar 21 05:34:06 crc kubenswrapper[4775]: I0321 05:34:06.244087 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-6lggf"] Mar 21 05:34:06 crc kubenswrapper[4775]: I0321 05:34:06.255596 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-6lggf"] Mar 21 05:34:07 crc kubenswrapper[4775]: I0321 05:34:07.683723 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6c8144-5186-4a29-a208-b929f47b1695" path="/var/lib/kubelet/pods/5d6c8144-5186-4a29-a208-b929f47b1695/volumes" Mar 21 05:34:48 crc kubenswrapper[4775]: I0321 05:34:48.042931 4775 scope.go:117] "RemoveContainer" containerID="b0bde1a2f3bc4718dd2d5ab50594a32268b2724339820463f90a790bd3794b31" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.461170 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-frvc6"] Mar 21 05:35:42 crc kubenswrapper[4775]: E0321 05:35:42.462415 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" containerName="oc" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462430 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" containerName="oc" Mar 21 05:35:42 crc kubenswrapper[4775]: E0321 05:35:42.462444 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="extract-utilities" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462451 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="extract-utilities" Mar 21 05:35:42 crc kubenswrapper[4775]: E0321 05:35:42.462474 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="extract-content" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462480 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="extract-content" Mar 21 05:35:42 crc kubenswrapper[4775]: E0321 05:35:42.462505 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="registry-server" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462511 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" containerName="registry-server" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462697 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" containerName="oc" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.462713 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="501afe19-8db4-4d25-8e06-56d1449b3643" 
containerName="registry-server" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.464176 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.478895 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frvc6"] Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.594702 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.594784 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.594864 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsl5\" (UniqueName: \"kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.697297 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " 
pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.697362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.697423 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsl5\" (UniqueName: \"kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.698484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.699141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.720785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsl5\" (UniqueName: \"kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5\") pod \"community-operators-frvc6\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") " 
pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:42 crc kubenswrapper[4775]: I0321 05:35:42.794421 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:43 crc kubenswrapper[4775]: I0321 05:35:43.438241 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frvc6"] Mar 21 05:35:44 crc kubenswrapper[4775]: I0321 05:35:44.047287 4775 generic.go:334] "Generic (PLEG): container finished" podID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerID="db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de" exitCode=0 Mar 21 05:35:44 crc kubenswrapper[4775]: I0321 05:35:44.047399 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerDied","Data":"db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de"} Mar 21 05:35:44 crc kubenswrapper[4775]: I0321 05:35:44.047704 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerStarted","Data":"8ac4aab02208f399e31bb5fece0a658d82b8219e89b0629a802dfd4e78ef1fe1"} Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.237797 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.241359 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.269029 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.328699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.328848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.328960 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkjr\" (UniqueName: \"kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.431669 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.431842 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xtkjr\" (UniqueName: \"kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.431955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.432477 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.432738 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.455569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkjr\" (UniqueName: \"kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr\") pod \"redhat-marketplace-xqxgq\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.616416 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:46 crc kubenswrapper[4775]: I0321 05:35:46.770357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerStarted","Data":"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"} Mar 21 05:35:47 crc kubenswrapper[4775]: I0321 05:35:47.162469 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:35:47 crc kubenswrapper[4775]: I0321 05:35:47.782175 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerID="7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6" exitCode=0 Mar 21 05:35:47 crc kubenswrapper[4775]: I0321 05:35:47.782242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerDied","Data":"7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6"} Mar 21 05:35:47 crc kubenswrapper[4775]: I0321 05:35:47.783658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerStarted","Data":"2e22d3f3fe30d01cb59c63a65374d6f6523a902e99ea14cb36acbf8af271d8b6"} Mar 21 05:35:48 crc kubenswrapper[4775]: I0321 05:35:48.794943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerStarted","Data":"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f"} Mar 21 05:35:48 crc kubenswrapper[4775]: I0321 05:35:48.798666 4775 generic.go:334] "Generic (PLEG): container finished" podID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" 
containerID="8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f" exitCode=0 Mar 21 05:35:48 crc kubenswrapper[4775]: I0321 05:35:48.798736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerDied","Data":"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"} Mar 21 05:35:49 crc kubenswrapper[4775]: I0321 05:35:49.816049 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerStarted","Data":"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"} Mar 21 05:35:49 crc kubenswrapper[4775]: I0321 05:35:49.820237 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerID="8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f" exitCode=0 Mar 21 05:35:49 crc kubenswrapper[4775]: I0321 05:35:49.820413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerDied","Data":"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f"} Mar 21 05:35:49 crc kubenswrapper[4775]: I0321 05:35:49.847962 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-frvc6" podStartSLOduration=2.658405369 podStartE2EDuration="7.847939603s" podCreationTimestamp="2026-03-21 05:35:42 +0000 UTC" firstStartedPulling="2026-03-21 05:35:44.050009914 +0000 UTC m=+2897.026473538" lastFinishedPulling="2026-03-21 05:35:49.239544148 +0000 UTC m=+2902.216007772" observedRunningTime="2026-03-21 05:35:49.840495118 +0000 UTC m=+2902.816958742" watchObservedRunningTime="2026-03-21 05:35:49.847939603 +0000 UTC m=+2902.824403237" Mar 21 05:35:50 crc kubenswrapper[4775]: I0321 
05:35:50.849507 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerStarted","Data":"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2"} Mar 21 05:35:50 crc kubenswrapper[4775]: I0321 05:35:50.873410 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xqxgq" podStartSLOduration=2.427559941 podStartE2EDuration="4.87338716s" podCreationTimestamp="2026-03-21 05:35:46 +0000 UTC" firstStartedPulling="2026-03-21 05:35:47.78496959 +0000 UTC m=+2900.761433214" lastFinishedPulling="2026-03-21 05:35:50.230796809 +0000 UTC m=+2903.207260433" observedRunningTime="2026-03-21 05:35:50.872529786 +0000 UTC m=+2903.848993420" watchObservedRunningTime="2026-03-21 05:35:50.87338716 +0000 UTC m=+2903.849850784" Mar 21 05:35:52 crc kubenswrapper[4775]: I0321 05:35:52.794920 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:52 crc kubenswrapper[4775]: I0321 05:35:52.796341 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-frvc6" Mar 21 05:35:53 crc kubenswrapper[4775]: I0321 05:35:53.849683 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-frvc6" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="registry-server" probeResult="failure" output=< Mar 21 05:35:53 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:35:53 crc kubenswrapper[4775]: > Mar 21 05:35:56 crc kubenswrapper[4775]: I0321 05:35:56.616907 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:56 crc kubenswrapper[4775]: I0321 05:35:56.617339 4775 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:56 crc kubenswrapper[4775]: I0321 05:35:56.670245 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:56 crc kubenswrapper[4775]: I0321 05:35:56.955288 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:35:57 crc kubenswrapper[4775]: I0321 05:35:57.011287 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:35:58 crc kubenswrapper[4775]: I0321 05:35:58.940328 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xqxgq" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="registry-server" containerID="cri-o://30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2" gracePeriod=2 Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.572300 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.745847 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities\") pod \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.746012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkjr\" (UniqueName: \"kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr\") pod \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.746165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content\") pod \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\" (UID: \"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd\") " Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.747065 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities" (OuterVolumeSpecName: "utilities") pod "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" (UID: "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.753099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr" (OuterVolumeSpecName: "kube-api-access-xtkjr") pod "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" (UID: "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd"). InnerVolumeSpecName "kube-api-access-xtkjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.772571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" (UID: "d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.849349 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.849391 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkjr\" (UniqueName: \"kubernetes.io/projected/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-kube-api-access-xtkjr\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.849401 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.951052 4775 generic.go:334] "Generic (PLEG): container finished" podID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerID="30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2" exitCode=0 Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.951096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerDied","Data":"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2"} Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.951163 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xqxgq" event={"ID":"d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd","Type":"ContainerDied","Data":"2e22d3f3fe30d01cb59c63a65374d6f6523a902e99ea14cb36acbf8af271d8b6"} Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.951184 4775 scope.go:117] "RemoveContainer" containerID="30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.951201 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqxgq" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.984415 4775 scope.go:117] "RemoveContainer" containerID="8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:35:59.993936 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.003592 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqxgq"] Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.009334 4775 scope.go:117] "RemoveContainer" containerID="7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.062275 4775 scope.go:117] "RemoveContainer" containerID="30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2" Mar 21 05:36:01 crc kubenswrapper[4775]: E0321 05:36:00.062887 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2\": container with ID starting with 30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2 not found: ID does not exist" containerID="30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.062925 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2"} err="failed to get container status \"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2\": rpc error: code = NotFound desc = could not find container \"30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2\": container with ID starting with 30e789e403772d7ffce3812d73fb279cd48892d4950ddba21d34141ef557b9c2 not found: ID does not exist" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.062947 4775 scope.go:117] "RemoveContainer" containerID="8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f" Mar 21 05:36:01 crc kubenswrapper[4775]: E0321 05:36:00.063436 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f\": container with ID starting with 8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f not found: ID does not exist" containerID="8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.063511 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f"} err="failed to get container status \"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f\": rpc error: code = NotFound desc = could not find container \"8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f\": container with ID starting with 8305d05030b8b6f9feaa1265bdc85f418d6ed22420c9ec22443e3decb737048f not found: ID does not exist" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.063541 4775 scope.go:117] "RemoveContainer" containerID="7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6" Mar 21 05:36:01 crc kubenswrapper[4775]: E0321 
05:36:00.064055 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6\": container with ID starting with 7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6 not found: ID does not exist" containerID="7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.064083 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6"} err="failed to get container status \"7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6\": rpc error: code = NotFound desc = could not find container \"7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6\": container with ID starting with 7555adb7dfe7a1ef65a02bb7b67511ce5cb7651c9d7d951e7c2e95e2b9054af6 not found: ID does not exist" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.155036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567856-vjs55"] Mar 21 05:36:01 crc kubenswrapper[4775]: E0321 05:36:00.155740 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="extract-utilities" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.155761 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="extract-utilities" Mar 21 05:36:01 crc kubenswrapper[4775]: E0321 05:36:00.155775 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="extract-content" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.155804 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="extract-content" Mar 21 05:36:01 crc 
kubenswrapper[4775]: E0321 05:36:00.155845 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="registry-server" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.155853 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="registry-server" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.156097 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" containerName="registry-server" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.156883 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-vjs55" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.159778 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.160075 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.160438 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.170935 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-vjs55"] Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.256648 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwxb\" (UniqueName: \"kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb\") pod \"auto-csr-approver-29567856-vjs55\" (UID: \"d0b49c68-9fa7-444c-b3cd-4bef5eff4765\") " pod="openshift-infra/auto-csr-approver-29567856-vjs55" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.359187 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwxb\" (UniqueName: \"kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb\") pod \"auto-csr-approver-29567856-vjs55\" (UID: \"d0b49c68-9fa7-444c-b3cd-4bef5eff4765\") " pod="openshift-infra/auto-csr-approver-29567856-vjs55" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.395620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwxb\" (UniqueName: \"kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb\") pod \"auto-csr-approver-29567856-vjs55\" (UID: \"d0b49c68-9fa7-444c-b3cd-4bef5eff4765\") " pod="openshift-infra/auto-csr-approver-29567856-vjs55" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:00.479323 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-vjs55" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:01.673955 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd" path="/var/lib/kubelet/pods/d1cf9da2-cb82-4e41-b9e6-5830ab22e5bd/volumes" Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:01.714996 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-vjs55"] Mar 21 05:36:01 crc kubenswrapper[4775]: W0321 05:36:01.726349 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b49c68_9fa7_444c_b3cd_4bef5eff4765.slice/crio-5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92 WatchSource:0}: Error finding container 5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92: Status 404 returned error can't find the container with id 5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92 Mar 21 05:36:01 crc kubenswrapper[4775]: I0321 05:36:01.971709 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-vjs55" event={"ID":"d0b49c68-9fa7-444c-b3cd-4bef5eff4765","Type":"ContainerStarted","Data":"5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92"}
Mar 21 05:36:02 crc kubenswrapper[4775]: I0321 05:36:02.861998 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-frvc6"
Mar 21 05:36:02 crc kubenswrapper[4775]: I0321 05:36:02.993547 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-frvc6"
Mar 21 05:36:03 crc kubenswrapper[4775]: I0321 05:36:03.314746 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frvc6"]
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.008887 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-frvc6" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="registry-server" containerID="cri-o://709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c" gracePeriod=2
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.619320 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frvc6"
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.678071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities\") pod \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") "
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.678314 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsl5\" (UniqueName: \"kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5\") pod \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") "
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.678488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content\") pod \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\" (UID: \"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4\") "
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.679671 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities" (OuterVolumeSpecName: "utilities") pod "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" (UID: "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.686844 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5" (OuterVolumeSpecName: "kube-api-access-spsl5") pod "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" (UID: "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4"). InnerVolumeSpecName "kube-api-access-spsl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.745941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" (UID: "7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.781409 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.781453 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:36:04 crc kubenswrapper[4775]: I0321 05:36:04.781471 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsl5\" (UniqueName: \"kubernetes.io/projected/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4-kube-api-access-spsl5\") on node \"crc\" DevicePath \"\""
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.025385 4775 generic.go:334] "Generic (PLEG): container finished" podID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerID="709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c" exitCode=0
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.025468 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frvc6"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.025482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerDied","Data":"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"}
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.025702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frvc6" event={"ID":"7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4","Type":"ContainerDied","Data":"8ac4aab02208f399e31bb5fece0a658d82b8219e89b0629a802dfd4e78ef1fe1"}
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.025729 4775 scope.go:117] "RemoveContainer" containerID="709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.030392 4775 generic.go:334] "Generic (PLEG): container finished" podID="d0b49c68-9fa7-444c-b3cd-4bef5eff4765" containerID="fe0d6a170f19e7078dcaeb489e7d9b4bba7aef3e240f5e59424f9dea5b0d382f" exitCode=0
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.030454 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-vjs55" event={"ID":"d0b49c68-9fa7-444c-b3cd-4bef5eff4765","Type":"ContainerDied","Data":"fe0d6a170f19e7078dcaeb489e7d9b4bba7aef3e240f5e59424f9dea5b0d382f"}
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.053869 4775 scope.go:117] "RemoveContainer" containerID="8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.072255 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frvc6"]
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.080237 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-frvc6"]
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.109233 4775 scope.go:117] "RemoveContainer" containerID="db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.134540 4775 scope.go:117] "RemoveContainer" containerID="709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"
Mar 21 05:36:05 crc kubenswrapper[4775]: E0321 05:36:05.135091 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c\": container with ID starting with 709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c not found: ID does not exist" containerID="709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.135232 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c"} err="failed to get container status \"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c\": rpc error: code = NotFound desc = could not find container \"709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c\": container with ID starting with 709878fa6db14a1a4c0041409907d252d31e1ef59aa788ccad3dcfcb237f419c not found: ID does not exist"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.135272 4775 scope.go:117] "RemoveContainer" containerID="8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"
Mar 21 05:36:05 crc kubenswrapper[4775]: E0321 05:36:05.135722 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f\": container with ID starting with 8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f not found: ID does not exist" containerID="8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.135802 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f"} err="failed to get container status \"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f\": rpc error: code = NotFound desc = could not find container \"8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f\": container with ID starting with 8df7790e4cf98b44556ff5c488018f03fe2a353bc67457fc94935e62f20efe2f not found: ID does not exist"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.135832 4775 scope.go:117] "RemoveContainer" containerID="db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de"
Mar 21 05:36:05 crc kubenswrapper[4775]: E0321 05:36:05.136179 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de\": container with ID starting with db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de not found: ID does not exist" containerID="db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.136207 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de"} err="failed to get container status \"db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de\": rpc error: code = NotFound desc = could not find container \"db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de\": container with ID starting with db650e0f182ffb5e56cbba3ab5adf162afa3a564247002767dee7622fc9639de not found: ID does not exist"
Mar 21 05:36:05 crc kubenswrapper[4775]: I0321 05:36:05.673174 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" path="/var/lib/kubelet/pods/7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4/volumes"
Mar 21 05:36:06 crc kubenswrapper[4775]: I0321 05:36:06.425233 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-vjs55"
Mar 21 05:36:06 crc kubenswrapper[4775]: I0321 05:36:06.524483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwxb\" (UniqueName: \"kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb\") pod \"d0b49c68-9fa7-444c-b3cd-4bef5eff4765\" (UID: \"d0b49c68-9fa7-444c-b3cd-4bef5eff4765\") "
Mar 21 05:36:06 crc kubenswrapper[4775]: I0321 05:36:06.532726 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb" (OuterVolumeSpecName: "kube-api-access-mtwxb") pod "d0b49c68-9fa7-444c-b3cd-4bef5eff4765" (UID: "d0b49c68-9fa7-444c-b3cd-4bef5eff4765"). InnerVolumeSpecName "kube-api-access-mtwxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:36:06 crc kubenswrapper[4775]: I0321 05:36:06.627237 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwxb\" (UniqueName: \"kubernetes.io/projected/d0b49c68-9fa7-444c-b3cd-4bef5eff4765-kube-api-access-mtwxb\") on node \"crc\" DevicePath \"\""
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.054407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-vjs55" event={"ID":"d0b49c68-9fa7-444c-b3cd-4bef5eff4765","Type":"ContainerDied","Data":"5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92"}
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.054459 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eb1c74e4a6841837e571124111b7f03c3a51fcbc60071ac45babd10071eab92"
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.054769 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-vjs55"
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.507717 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-gfbv5"]
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.519963 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-gfbv5"]
Mar 21 05:36:07 crc kubenswrapper[4775]: I0321 05:36:07.671938 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2616ee6e-f4b6-4258-b738-95ad6aab1644" path="/var/lib/kubelet/pods/2616ee6e-f4b6-4258-b738-95ad6aab1644/volumes"
Mar 21 05:36:32 crc kubenswrapper[4775]: I0321 05:36:32.482713 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:36:32 crc kubenswrapper[4775]: I0321 05:36:32.483774 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:36:48 crc kubenswrapper[4775]: I0321 05:36:48.171848 4775 scope.go:117] "RemoveContainer" containerID="2262a4a2de084f40220b6b85086a44442e4d874dfee1c84bf3dcc589fdc32198"
Mar 21 05:37:02 crc kubenswrapper[4775]: I0321 05:37:02.482373 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:37:02 crc kubenswrapper[4775]: I0321 05:37:02.483020 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:37:32 crc kubenswrapper[4775]: I0321 05:37:32.482584 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:37:32 crc kubenswrapper[4775]: I0321 05:37:32.483105 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:37:32 crc kubenswrapper[4775]: I0321 05:37:32.483168 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn"
Mar 21 05:37:32 crc kubenswrapper[4775]: I0321 05:37:32.483908 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 05:37:32 crc kubenswrapper[4775]: I0321 05:37:32.483952 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" gracePeriod=600
Mar 21 05:37:32 crc kubenswrapper[4775]: E0321 05:37:32.631108 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:37:33 crc kubenswrapper[4775]: I0321 05:37:33.451288 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" exitCode=0
Mar 21 05:37:33 crc kubenswrapper[4775]: I0321 05:37:33.451397 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"}
Mar 21 05:37:33 crc kubenswrapper[4775]: I0321 05:37:33.451897 4775 scope.go:117] "RemoveContainer" containerID="ea40506310b7606d65a9c28447427b7d1714319295a97c7d0f6f7a3b4b8251ee"
Mar 21 05:37:33 crc kubenswrapper[4775]: I0321 05:37:33.452843 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:37:33 crc kubenswrapper[4775]: E0321 05:37:33.453166 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:37:44 crc kubenswrapper[4775]: I0321 05:37:44.661477 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:37:44 crc kubenswrapper[4775]: E0321 05:37:44.662603 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:37:57 crc kubenswrapper[4775]: I0321 05:37:57.667767 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:37:57 crc kubenswrapper[4775]: E0321 05:37:57.668690 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.146668 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567858-n842g"]
Mar 21 05:38:00 crc kubenswrapper[4775]: E0321 05:38:00.147485 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="extract-utilities"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147502 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="extract-utilities"
Mar 21 05:38:00 crc kubenswrapper[4775]: E0321 05:38:00.147522 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b49c68-9fa7-444c-b3cd-4bef5eff4765" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147529 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b49c68-9fa7-444c-b3cd-4bef5eff4765" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4775]: E0321 05:38:00.147542 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="extract-content"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147549 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="extract-content"
Mar 21 05:38:00 crc kubenswrapper[4775]: E0321 05:38:00.147561 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="registry-server"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147566 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="registry-server"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147761 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b49c68-9fa7-444c-b3cd-4bef5eff4765" containerName="oc"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.147790 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7563d1d2-9c8c-4ae9-8748-cf7f6f2503c4" containerName="registry-server"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.148568 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.153775 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.153873 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.154445 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.159485 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-n842g"]
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.323599 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qkt\" (UniqueName: \"kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt\") pod \"auto-csr-approver-29567858-n842g\" (UID: \"b9b806cf-43d2-4304-8508-8f5af524cb37\") " pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.425677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qkt\" (UniqueName: \"kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt\") pod \"auto-csr-approver-29567858-n842g\" (UID: \"b9b806cf-43d2-4304-8508-8f5af524cb37\") " pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.445243 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qkt\" (UniqueName: \"kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt\") pod \"auto-csr-approver-29567858-n842g\" (UID: \"b9b806cf-43d2-4304-8508-8f5af524cb37\") " pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.496775 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.925444 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-n842g"]
Mar 21 05:38:00 crc kubenswrapper[4775]: I0321 05:38:00.934031 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:38:01 crc kubenswrapper[4775]: I0321 05:38:01.730553 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-n842g" event={"ID":"b9b806cf-43d2-4304-8508-8f5af524cb37","Type":"ContainerStarted","Data":"067d93dbd9bcbf0db5009f8884065250c82b7e0029e12b5809002041baf60978"}
Mar 21 05:38:02 crc kubenswrapper[4775]: I0321 05:38:02.744133 4775 generic.go:334] "Generic (PLEG): container finished" podID="b9b806cf-43d2-4304-8508-8f5af524cb37" containerID="111e8b4e076f4628dda53e32e64a39739272ad46a623d86e91f2a428452f6fb0" exitCode=0
Mar 21 05:38:02 crc kubenswrapper[4775]: I0321 05:38:02.744259 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-n842g" event={"ID":"b9b806cf-43d2-4304-8508-8f5af524cb37","Type":"ContainerDied","Data":"111e8b4e076f4628dda53e32e64a39739272ad46a623d86e91f2a428452f6fb0"}
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.212011 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.409561 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qkt\" (UniqueName: \"kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt\") pod \"b9b806cf-43d2-4304-8508-8f5af524cb37\" (UID: \"b9b806cf-43d2-4304-8508-8f5af524cb37\") "
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.417436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt" (OuterVolumeSpecName: "kube-api-access-c2qkt") pod "b9b806cf-43d2-4304-8508-8f5af524cb37" (UID: "b9b806cf-43d2-4304-8508-8f5af524cb37"). InnerVolumeSpecName "kube-api-access-c2qkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.512647 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qkt\" (UniqueName: \"kubernetes.io/projected/b9b806cf-43d2-4304-8508-8f5af524cb37-kube-api-access-c2qkt\") on node \"crc\" DevicePath \"\""
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.762515 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-n842g" event={"ID":"b9b806cf-43d2-4304-8508-8f5af524cb37","Type":"ContainerDied","Data":"067d93dbd9bcbf0db5009f8884065250c82b7e0029e12b5809002041baf60978"}
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.762584 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067d93dbd9bcbf0db5009f8884065250c82b7e0029e12b5809002041baf60978"
Mar 21 05:38:04 crc kubenswrapper[4775]: I0321 05:38:04.762593 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-n842g"
Mar 21 05:38:05 crc kubenswrapper[4775]: I0321 05:38:05.285106 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-hd7nd"]
Mar 21 05:38:05 crc kubenswrapper[4775]: I0321 05:38:05.297522 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-hd7nd"]
Mar 21 05:38:05 crc kubenswrapper[4775]: I0321 05:38:05.671566 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15017759-960b-4d6e-8ce9-deef2cb94155" path="/var/lib/kubelet/pods/15017759-960b-4d6e-8ce9-deef2cb94155/volumes"
Mar 21 05:38:10 crc kubenswrapper[4775]: I0321 05:38:10.661767 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:38:10 crc kubenswrapper[4775]: E0321 05:38:10.662263 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:38:21 crc kubenswrapper[4775]: I0321 05:38:21.664257 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:38:21 crc kubenswrapper[4775]: E0321 05:38:21.666095 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:38:35 crc kubenswrapper[4775]: I0321 05:38:35.661232 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:38:35 crc kubenswrapper[4775]: E0321 05:38:35.662343 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:38:48 crc kubenswrapper[4775]: I0321 05:38:48.332038 4775 scope.go:117] "RemoveContainer" containerID="25828aaca134034962b25d7ea45e1870919d75c0c4923758d122b6c7bce2397c"
Mar 21 05:38:49 crc kubenswrapper[4775]: I0321 05:38:49.661244 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:38:49 crc kubenswrapper[4775]: E0321 05:38:49.661580 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:39:03 crc kubenswrapper[4775]: I0321 05:39:03.662261 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:39:03 crc kubenswrapper[4775]: E0321 05:39:03.663111 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:39:16 crc kubenswrapper[4775]: I0321 05:39:16.662366 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:39:16 crc kubenswrapper[4775]: E0321 05:39:16.663880 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:39:29 crc kubenswrapper[4775]: I0321 05:39:29.661165 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:39:29 crc kubenswrapper[4775]: E0321 05:39:29.661900 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:39:42 crc kubenswrapper[4775]: I0321 05:39:42.662194 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:39:42 crc kubenswrapper[4775]: E0321 05:39:42.663146 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:39:57 crc kubenswrapper[4775]: I0321 05:39:57.667381 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:39:57 crc kubenswrapper[4775]: E0321 05:39:57.668260 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.156456 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567860-f4gsq"]
Mar 21 05:40:00 crc kubenswrapper[4775]: E0321 05:40:00.157469 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b806cf-43d2-4304-8508-8f5af524cb37" containerName="oc"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.157559 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b806cf-43d2-4304-8508-8f5af524cb37" containerName="oc"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.158013 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b806cf-43d2-4304-8508-8f5af524cb37" containerName="oc"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.158825 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.161219 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.161371 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.161392 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.167656 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-f4gsq"]
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.312646 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg6k\" (UniqueName: \"kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k\") pod \"auto-csr-approver-29567860-f4gsq\" (UID: \"42fbc24f-3528-4778-ae85-23e0c2fe99d9\") " pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.414511 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg6k\" (UniqueName: \"kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k\") pod \"auto-csr-approver-29567860-f4gsq\" (UID: \"42fbc24f-3528-4778-ae85-23e0c2fe99d9\") " pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.450700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg6k\" (UniqueName: \"kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k\") pod \"auto-csr-approver-29567860-f4gsq\" (UID: \"42fbc24f-3528-4778-ae85-23e0c2fe99d9\") " pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:00 crc kubenswrapper[4775]: I0321 05:40:00.500312 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:01 crc kubenswrapper[4775]: I0321 05:40:01.000539 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-f4gsq"]
Mar 21 05:40:01 crc kubenswrapper[4775]: I0321 05:40:01.026450 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-f4gsq" event={"ID":"42fbc24f-3528-4778-ae85-23e0c2fe99d9","Type":"ContainerStarted","Data":"6760cd7c21187c27e823e38f6472df2e0c87e6b8a85e2a72d82e6594ddfe9fc2"}
Mar 21 05:40:04 crc kubenswrapper[4775]: I0321 05:40:04.052681 4775 generic.go:334] "Generic (PLEG): container finished" podID="42fbc24f-3528-4778-ae85-23e0c2fe99d9" containerID="772f50d7983f92c55a24cd77459f228985481da63abdb797f0dc2f295541ac61" exitCode=0
Mar 21 05:40:04 crc kubenswrapper[4775]: I0321 05:40:04.052776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-f4gsq" event={"ID":"42fbc24f-3528-4778-ae85-23e0c2fe99d9","Type":"ContainerDied","Data":"772f50d7983f92c55a24cd77459f228985481da63abdb797f0dc2f295541ac61"}
Mar 21 05:40:05 crc kubenswrapper[4775]: I0321 05:40:05.424684 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-f4gsq"
Mar 21 05:40:05 crc kubenswrapper[4775]: I0321 05:40:05.449680 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhg6k\" (UniqueName: \"kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k\") pod \"42fbc24f-3528-4778-ae85-23e0c2fe99d9\" (UID: \"42fbc24f-3528-4778-ae85-23e0c2fe99d9\") "
Mar 21 05:40:05 crc kubenswrapper[4775]: I0321 05:40:05.459666 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k" (OuterVolumeSpecName: "kube-api-access-nhg6k") pod "42fbc24f-3528-4778-ae85-23e0c2fe99d9" (UID: "42fbc24f-3528-4778-ae85-23e0c2fe99d9"). InnerVolumeSpecName "kube-api-access-nhg6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:05 crc kubenswrapper[4775]: I0321 05:40:05.552355 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhg6k\" (UniqueName: \"kubernetes.io/projected/42fbc24f-3528-4778-ae85-23e0c2fe99d9-kube-api-access-nhg6k\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:06 crc kubenswrapper[4775]: I0321 05:40:06.070969 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-f4gsq" event={"ID":"42fbc24f-3528-4778-ae85-23e0c2fe99d9","Type":"ContainerDied","Data":"6760cd7c21187c27e823e38f6472df2e0c87e6b8a85e2a72d82e6594ddfe9fc2"} Mar 21 05:40:06 crc kubenswrapper[4775]: I0321 05:40:06.071347 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6760cd7c21187c27e823e38f6472df2e0c87e6b8a85e2a72d82e6594ddfe9fc2" Mar 21 05:40:06 crc kubenswrapper[4775]: I0321 05:40:06.071038 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-f4gsq" Mar 21 05:40:06 crc kubenswrapper[4775]: I0321 05:40:06.530183 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-gxf82"] Mar 21 05:40:06 crc kubenswrapper[4775]: I0321 05:40:06.545047 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-gxf82"] Mar 21 05:40:07 crc kubenswrapper[4775]: I0321 05:40:07.673000 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afdb9c39-b2aa-496c-8cfb-917fd0b1cc15" path="/var/lib/kubelet/pods/afdb9c39-b2aa-496c-8cfb-917fd0b1cc15/volumes" Mar 21 05:40:10 crc kubenswrapper[4775]: I0321 05:40:10.661343 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:40:10 crc kubenswrapper[4775]: E0321 05:40:10.661890 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:40:25 crc kubenswrapper[4775]: I0321 05:40:25.661013 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:40:25 crc kubenswrapper[4775]: E0321 05:40:25.661880 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:40:37 crc kubenswrapper[4775]: I0321 05:40:37.688358 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:40:37 crc kubenswrapper[4775]: E0321 05:40:37.689415 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.676578 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:40:47 crc kubenswrapper[4775]: E0321 05:40:47.677536 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="42fbc24f-3528-4778-ae85-23e0c2fe99d9" containerName="oc" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.677558 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fbc24f-3528-4778-ae85-23e0c2fe99d9" containerName="oc" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.677866 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fbc24f-3528-4778-ae85-23e0c2fe99d9" containerName="oc" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.679848 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.686672 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.745866 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.746197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.746344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghxxm\" (UniqueName: \"kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " 
pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.848450 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.848601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.848649 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghxxm\" (UniqueName: \"kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.848937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc kubenswrapper[4775]: I0321 05:40:47.849057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:47 crc 
kubenswrapper[4775]: I0321 05:40:47.904285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghxxm\" (UniqueName: \"kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm\") pod \"redhat-operators-59krg\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:48 crc kubenswrapper[4775]: I0321 05:40:48.011153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:40:48 crc kubenswrapper[4775]: I0321 05:40:48.421320 4775 scope.go:117] "RemoveContainer" containerID="20714a401828b9647e317e22423f1e191dc985cb13d154d0fa57a94896564896" Mar 21 05:40:48 crc kubenswrapper[4775]: I0321 05:40:48.496759 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:40:49 crc kubenswrapper[4775]: I0321 05:40:49.479537 4775 generic.go:334] "Generic (PLEG): container finished" podID="c64d253c-50fa-492f-a3df-47eca811f134" containerID="700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37" exitCode=0 Mar 21 05:40:49 crc kubenswrapper[4775]: I0321 05:40:49.479684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerDied","Data":"700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37"} Mar 21 05:40:49 crc kubenswrapper[4775]: I0321 05:40:49.480937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerStarted","Data":"77c20c16df362ce9ee895d366c0649f371e0ba1c56bfd5984e139390c537c4cc"} Mar 21 05:40:52 crc kubenswrapper[4775]: I0321 05:40:52.511679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" 
event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerStarted","Data":"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38"} Mar 21 05:40:52 crc kubenswrapper[4775]: I0321 05:40:52.661649 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:40:52 crc kubenswrapper[4775]: E0321 05:40:52.662247 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:41:04 crc kubenswrapper[4775]: I0321 05:41:04.661824 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:41:04 crc kubenswrapper[4775]: E0321 05:41:04.662620 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:41:08 crc kubenswrapper[4775]: I0321 05:41:08.651458 4775 generic.go:334] "Generic (PLEG): container finished" podID="c64d253c-50fa-492f-a3df-47eca811f134" containerID="a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38" exitCode=0 Mar 21 05:41:08 crc kubenswrapper[4775]: I0321 05:41:08.651535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" 
event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerDied","Data":"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38"} Mar 21 05:41:09 crc kubenswrapper[4775]: I0321 05:41:09.677641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerStarted","Data":"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda"} Mar 21 05:41:09 crc kubenswrapper[4775]: I0321 05:41:09.707812 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-59krg" podStartSLOduration=3.079393263 podStartE2EDuration="22.707791283s" podCreationTimestamp="2026-03-21 05:40:47 +0000 UTC" firstStartedPulling="2026-03-21 05:40:49.482383295 +0000 UTC m=+3202.458846919" lastFinishedPulling="2026-03-21 05:41:09.110781315 +0000 UTC m=+3222.087244939" observedRunningTime="2026-03-21 05:41:09.700647599 +0000 UTC m=+3222.677111223" watchObservedRunningTime="2026-03-21 05:41:09.707791283 +0000 UTC m=+3222.684254907" Mar 21 05:41:18 crc kubenswrapper[4775]: I0321 05:41:18.011628 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:18 crc kubenswrapper[4775]: I0321 05:41:18.012564 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:18 crc kubenswrapper[4775]: I0321 05:41:18.066100 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:18 crc kubenswrapper[4775]: I0321 05:41:18.793057 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:18 crc kubenswrapper[4775]: I0321 05:41:18.860323 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:41:19 crc kubenswrapper[4775]: I0321 05:41:19.662147 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:41:19 crc kubenswrapper[4775]: E0321 05:41:19.662608 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:41:20 crc kubenswrapper[4775]: I0321 05:41:20.761795 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-59krg" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="registry-server" containerID="cri-o://aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda" gracePeriod=2 Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.431470 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.479338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghxxm\" (UniqueName: \"kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm\") pod \"c64d253c-50fa-492f-a3df-47eca811f134\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.479483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content\") pod \"c64d253c-50fa-492f-a3df-47eca811f134\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.479575 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities\") pod \"c64d253c-50fa-492f-a3df-47eca811f134\" (UID: \"c64d253c-50fa-492f-a3df-47eca811f134\") " Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.480982 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities" (OuterVolumeSpecName: "utilities") pod "c64d253c-50fa-492f-a3df-47eca811f134" (UID: "c64d253c-50fa-492f-a3df-47eca811f134"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.486244 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm" (OuterVolumeSpecName: "kube-api-access-ghxxm") pod "c64d253c-50fa-492f-a3df-47eca811f134" (UID: "c64d253c-50fa-492f-a3df-47eca811f134"). InnerVolumeSpecName "kube-api-access-ghxxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.581785 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.581817 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghxxm\" (UniqueName: \"kubernetes.io/projected/c64d253c-50fa-492f-a3df-47eca811f134-kube-api-access-ghxxm\") on node \"crc\" DevicePath \"\"" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.622422 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c64d253c-50fa-492f-a3df-47eca811f134" (UID: "c64d253c-50fa-492f-a3df-47eca811f134"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.683957 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c64d253c-50fa-492f-a3df-47eca811f134-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.772147 4775 generic.go:334] "Generic (PLEG): container finished" podID="c64d253c-50fa-492f-a3df-47eca811f134" containerID="aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda" exitCode=0 Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.772207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59krg" event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerDied","Data":"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda"} Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.772284 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-59krg" event={"ID":"c64d253c-50fa-492f-a3df-47eca811f134","Type":"ContainerDied","Data":"77c20c16df362ce9ee895d366c0649f371e0ba1c56bfd5984e139390c537c4cc"} Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.772227 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59krg" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.772307 4775 scope.go:117] "RemoveContainer" containerID="aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.800142 4775 scope.go:117] "RemoveContainer" containerID="a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.807946 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.817733 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-59krg"] Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.824380 4775 scope.go:117] "RemoveContainer" containerID="700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.858997 4775 scope.go:117] "RemoveContainer" containerID="aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda" Mar 21 05:41:21 crc kubenswrapper[4775]: E0321 05:41:21.859570 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda\": container with ID starting with aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda not found: ID does not exist" containerID="aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.859603 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda"} err="failed to get container status \"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda\": rpc error: code = NotFound desc = could not find container \"aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda\": container with ID starting with aa6271b27072a7e8897fe55d4458fbaffe65abbbda75fce97d2afa7d3ec31cda not found: ID does not exist" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.859642 4775 scope.go:117] "RemoveContainer" containerID="a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38" Mar 21 05:41:21 crc kubenswrapper[4775]: E0321 05:41:21.859993 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38\": container with ID starting with a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38 not found: ID does not exist" containerID="a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.860023 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38"} err="failed to get container status \"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38\": rpc error: code = NotFound desc = could not find container \"a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38\": container with ID starting with a3dfd9b083b6a3d2298d5af105729159a72ee6cc98844f82a4efca8eeecdcf38 not found: ID does not exist" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.860044 4775 scope.go:117] "RemoveContainer" containerID="700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37" Mar 21 05:41:21 crc kubenswrapper[4775]: E0321 
05:41:21.860558 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37\": container with ID starting with 700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37 not found: ID does not exist" containerID="700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37" Mar 21 05:41:21 crc kubenswrapper[4775]: I0321 05:41:21.860698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37"} err="failed to get container status \"700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37\": rpc error: code = NotFound desc = could not find container \"700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37\": container with ID starting with 700a025de9fada119b93b8005462e4c55304278415f6ec6840a7bd2dd4fe6a37 not found: ID does not exist" Mar 21 05:41:23 crc kubenswrapper[4775]: I0321 05:41:23.672791 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64d253c-50fa-492f-a3df-47eca811f134" path="/var/lib/kubelet/pods/c64d253c-50fa-492f-a3df-47eca811f134/volumes" Mar 21 05:41:30 crc kubenswrapper[4775]: I0321 05:41:30.661985 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:41:30 crc kubenswrapper[4775]: E0321 05:41:30.662829 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:41:44 crc kubenswrapper[4775]: I0321 05:41:44.661480 
4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:41:44 crc kubenswrapper[4775]: E0321 05:41:44.662353 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:41:57 crc kubenswrapper[4775]: I0321 05:41:57.667472 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:41:57 crc kubenswrapper[4775]: E0321 05:41:57.668298 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.194864 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567862-bbnhp"]
Mar 21 05:42:00 crc kubenswrapper[4775]: E0321 05:42:00.195862 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="extract-content"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.195878 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="extract-content"
Mar 21 05:42:00 crc kubenswrapper[4775]: E0321 05:42:00.195926 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="extract-utilities"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.195936 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="extract-utilities"
Mar 21 05:42:00 crc kubenswrapper[4775]: E0321 05:42:00.195953 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="registry-server"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.195959 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="registry-server"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.196176 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64d253c-50fa-492f-a3df-47eca811f134" containerName="registry-server"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.196808 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.204456 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.205266 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.205355 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.210480 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-bbnhp"]
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.317790 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5snjv\" (UniqueName: \"kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv\") pod \"auto-csr-approver-29567862-bbnhp\" (UID: \"89f5316a-d6df-4c37-b878-4f3ef82d837b\") " pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.420502 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5snjv\" (UniqueName: \"kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv\") pod \"auto-csr-approver-29567862-bbnhp\" (UID: \"89f5316a-d6df-4c37-b878-4f3ef82d837b\") " pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.448417 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5snjv\" (UniqueName: \"kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv\") pod \"auto-csr-approver-29567862-bbnhp\" (UID: \"89f5316a-d6df-4c37-b878-4f3ef82d837b\") " pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:00 crc kubenswrapper[4775]: I0321 05:42:00.880918 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:01 crc kubenswrapper[4775]: I0321 05:42:01.452380 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-bbnhp"]
Mar 21 05:42:02 crc kubenswrapper[4775]: I0321 05:42:02.343621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-bbnhp" event={"ID":"89f5316a-d6df-4c37-b878-4f3ef82d837b","Type":"ContainerStarted","Data":"112eb22a8eb1e402751c2cac58926bd161046e0f023e9a78cfdde548ce731b30"}
Mar 21 05:42:03 crc kubenswrapper[4775]: I0321 05:42:03.354859 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-bbnhp" event={"ID":"89f5316a-d6df-4c37-b878-4f3ef82d837b","Type":"ContainerStarted","Data":"5c92c80d8920da595f41beeabde7d0200200f2dfb0bec0c216c4b59513f9ce40"}
Mar 21 05:42:03 crc kubenswrapper[4775]: I0321 05:42:03.404842 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567862-bbnhp" podStartSLOduration=1.861614872 podStartE2EDuration="3.404802468s" podCreationTimestamp="2026-03-21 05:42:00 +0000 UTC" firstStartedPulling="2026-03-21 05:42:01.460242244 +0000 UTC m=+3274.436705858" lastFinishedPulling="2026-03-21 05:42:03.00342983 +0000 UTC m=+3275.979893454" observedRunningTime="2026-03-21 05:42:03.393450564 +0000 UTC m=+3276.369914198" watchObservedRunningTime="2026-03-21 05:42:03.404802468 +0000 UTC m=+3276.381266092"
Mar 21 05:42:04 crc kubenswrapper[4775]: I0321 05:42:04.363057 4775 generic.go:334] "Generic (PLEG): container finished" podID="89f5316a-d6df-4c37-b878-4f3ef82d837b" containerID="5c92c80d8920da595f41beeabde7d0200200f2dfb0bec0c216c4b59513f9ce40" exitCode=0
Mar 21 05:42:04 crc kubenswrapper[4775]: I0321 05:42:04.363222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-bbnhp" event={"ID":"89f5316a-d6df-4c37-b878-4f3ef82d837b","Type":"ContainerDied","Data":"5c92c80d8920da595f41beeabde7d0200200f2dfb0bec0c216c4b59513f9ce40"}
Mar 21 05:42:05 crc kubenswrapper[4775]: I0321 05:42:05.825909 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:05 crc kubenswrapper[4775]: I0321 05:42:05.935336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5snjv\" (UniqueName: \"kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv\") pod \"89f5316a-d6df-4c37-b878-4f3ef82d837b\" (UID: \"89f5316a-d6df-4c37-b878-4f3ef82d837b\") "
Mar 21 05:42:05 crc kubenswrapper[4775]: I0321 05:42:05.942691 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv" (OuterVolumeSpecName: "kube-api-access-5snjv") pod "89f5316a-d6df-4c37-b878-4f3ef82d837b" (UID: "89f5316a-d6df-4c37-b878-4f3ef82d837b"). InnerVolumeSpecName "kube-api-access-5snjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.039342 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5snjv\" (UniqueName: \"kubernetes.io/projected/89f5316a-d6df-4c37-b878-4f3ef82d837b-kube-api-access-5snjv\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.385465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-bbnhp" event={"ID":"89f5316a-d6df-4c37-b878-4f3ef82d837b","Type":"ContainerDied","Data":"112eb22a8eb1e402751c2cac58926bd161046e0f023e9a78cfdde548ce731b30"}
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.385953 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="112eb22a8eb1e402751c2cac58926bd161046e0f023e9a78cfdde548ce731b30"
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.385705 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-bbnhp"
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.898618 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-vjs55"]
Mar 21 05:42:06 crc kubenswrapper[4775]: I0321 05:42:06.908315 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-vjs55"]
Mar 21 05:42:07 crc kubenswrapper[4775]: I0321 05:42:07.672350 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b49c68-9fa7-444c-b3cd-4bef5eff4765" path="/var/lib/kubelet/pods/d0b49c68-9fa7-444c-b3cd-4bef5eff4765/volumes"
Mar 21 05:42:11 crc kubenswrapper[4775]: I0321 05:42:11.662724 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:42:11 crc kubenswrapper[4775]: E0321 05:42:11.663807 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:42:26 crc kubenswrapper[4775]: I0321 05:42:26.661958 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:42:26 crc kubenswrapper[4775]: E0321 05:42:26.662872 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:42:38 crc kubenswrapper[4775]: I0321 05:42:38.661524 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9"
Mar 21 05:42:39 crc kubenswrapper[4775]: I0321 05:42:39.716582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2"}
Mar 21 05:42:48 crc kubenswrapper[4775]: I0321 05:42:48.541333 4775 scope.go:117] "RemoveContainer" containerID="fe0d6a170f19e7078dcaeb489e7d9b4bba7aef3e240f5e59424f9dea5b0d382f"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.144810 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567864-zsztj"]
Mar 21 05:44:00 crc kubenswrapper[4775]: E0321 05:44:00.145872 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f5316a-d6df-4c37-b878-4f3ef82d837b" containerName="oc"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.145888 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f5316a-d6df-4c37-b878-4f3ef82d837b" containerName="oc"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.146080 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f5316a-d6df-4c37-b878-4f3ef82d837b" containerName="oc"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.146861 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.149733 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.150411 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.150606 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.158409 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-zsztj"]
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.316484 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmtp\" (UniqueName: \"kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp\") pod \"auto-csr-approver-29567864-zsztj\" (UID: \"3565ae50-5b89-4775-a8c2-45209b86619f\") " pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.418339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmtp\" (UniqueName: \"kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp\") pod \"auto-csr-approver-29567864-zsztj\" (UID: \"3565ae50-5b89-4775-a8c2-45209b86619f\") " pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.441378 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmtp\" (UniqueName: \"kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp\") pod \"auto-csr-approver-29567864-zsztj\" (UID: \"3565ae50-5b89-4775-a8c2-45209b86619f\") " pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:00 crc kubenswrapper[4775]: I0321 05:44:00.470350 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:01 crc kubenswrapper[4775]: I0321 05:44:01.151599 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-zsztj"]
Mar 21 05:44:01 crc kubenswrapper[4775]: W0321 05:44:01.161380 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3565ae50_5b89_4775_a8c2_45209b86619f.slice/crio-6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1 WatchSource:0}: Error finding container 6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1: Status 404 returned error can't find the container with id 6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1
Mar 21 05:44:01 crc kubenswrapper[4775]: I0321 05:44:01.163729 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:44:01 crc kubenswrapper[4775]: I0321 05:44:01.857501 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567864-zsztj" event={"ID":"3565ae50-5b89-4775-a8c2-45209b86619f","Type":"ContainerStarted","Data":"6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1"}
Mar 21 05:44:02 crc kubenswrapper[4775]: I0321 05:44:02.870195 4775 generic.go:334] "Generic (PLEG): container finished" podID="3565ae50-5b89-4775-a8c2-45209b86619f" containerID="dbc52e5cee44a0750d2409abd45a8584ff150c5de4624d45b211f61e09e87efd" exitCode=0
Mar 21 05:44:02 crc kubenswrapper[4775]: I0321 05:44:02.870393 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567864-zsztj" event={"ID":"3565ae50-5b89-4775-a8c2-45209b86619f","Type":"ContainerDied","Data":"dbc52e5cee44a0750d2409abd45a8584ff150c5de4624d45b211f61e09e87efd"}
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.395545 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.406553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhmtp\" (UniqueName: \"kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp\") pod \"3565ae50-5b89-4775-a8c2-45209b86619f\" (UID: \"3565ae50-5b89-4775-a8c2-45209b86619f\") "
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.420330 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp" (OuterVolumeSpecName: "kube-api-access-zhmtp") pod "3565ae50-5b89-4775-a8c2-45209b86619f" (UID: "3565ae50-5b89-4775-a8c2-45209b86619f"). InnerVolumeSpecName "kube-api-access-zhmtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.509048 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhmtp\" (UniqueName: \"kubernetes.io/projected/3565ae50-5b89-4775-a8c2-45209b86619f-kube-api-access-zhmtp\") on node \"crc\" DevicePath \"\""
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.888492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567864-zsztj" event={"ID":"3565ae50-5b89-4775-a8c2-45209b86619f","Type":"ContainerDied","Data":"6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1"}
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.888540 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a46803ca1a7c435f38c1681d6ceba3928009c92f6a9af994162f97a780be6c1"
Mar 21 05:44:04 crc kubenswrapper[4775]: I0321 05:44:04.888578 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567864-zsztj"
Mar 21 05:44:05 crc kubenswrapper[4775]: I0321 05:44:05.478549 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-n842g"]
Mar 21 05:44:05 crc kubenswrapper[4775]: I0321 05:44:05.490696 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-n842g"]
Mar 21 05:44:05 crc kubenswrapper[4775]: I0321 05:44:05.671422 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b806cf-43d2-4304-8508-8f5af524cb37" path="/var/lib/kubelet/pods/b9b806cf-43d2-4304-8508-8f5af524cb37/volumes"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.760807 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:19 crc kubenswrapper[4775]: E0321 05:44:19.761662 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3565ae50-5b89-4775-a8c2-45209b86619f" containerName="oc"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.761680 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3565ae50-5b89-4775-a8c2-45209b86619f" containerName="oc"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.761941 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3565ae50-5b89-4775-a8c2-45209b86619f" containerName="oc"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.763771 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.791315 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.909267 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjmfs\" (UniqueName: \"kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.909744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:19 crc kubenswrapper[4775]: I0321 05:44:19.909794 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.011656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.011962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.012131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjmfs\" (UniqueName: \"kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.012340 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.012440 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.038205 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjmfs\" (UniqueName: \"kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs\") pod \"certified-operators-xpg9p\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") " pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.084900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:20 crc kubenswrapper[4775]: I0321 05:44:20.706864 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:21 crc kubenswrapper[4775]: I0321 05:44:21.230765 4775 generic.go:334] "Generic (PLEG): container finished" podID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerID="9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad" exitCode=0
Mar 21 05:44:21 crc kubenswrapper[4775]: I0321 05:44:21.230825 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerDied","Data":"9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad"}
Mar 21 05:44:21 crc kubenswrapper[4775]: I0321 05:44:21.231053 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerStarted","Data":"1555376c25e05b9026c4e9c7682126d2efcfb8571cf0c2c03216cdf17dee8a8d"}
Mar 21 05:44:23 crc kubenswrapper[4775]: I0321 05:44:23.254591 4775 generic.go:334] "Generic (PLEG): container finished" podID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerID="e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf" exitCode=0
Mar 21 05:44:23 crc kubenswrapper[4775]: I0321 05:44:23.254807 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerDied","Data":"e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf"}
Mar 21 05:44:25 crc kubenswrapper[4775]: I0321 05:44:25.275977 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerStarted","Data":"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"}
Mar 21 05:44:25 crc kubenswrapper[4775]: I0321 05:44:25.291701 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xpg9p" podStartSLOduration=3.179261875 podStartE2EDuration="6.291685837s" podCreationTimestamp="2026-03-21 05:44:19 +0000 UTC" firstStartedPulling="2026-03-21 05:44:21.232322256 +0000 UTC m=+3414.208785880" lastFinishedPulling="2026-03-21 05:44:24.344746218 +0000 UTC m=+3417.321209842" observedRunningTime="2026-03-21 05:44:25.290964226 +0000 UTC m=+3418.267427850" watchObservedRunningTime="2026-03-21 05:44:25.291685837 +0000 UTC m=+3418.268149461"
Mar 21 05:44:30 crc kubenswrapper[4775]: I0321 05:44:30.086830 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:30 crc kubenswrapper[4775]: I0321 05:44:30.087553 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:30 crc kubenswrapper[4775]: I0321 05:44:30.137839 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:30 crc kubenswrapper[4775]: I0321 05:44:30.368276 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:30 crc kubenswrapper[4775]: I0321 05:44:30.425272 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:32 crc kubenswrapper[4775]: I0321 05:44:32.334850 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xpg9p" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="registry-server" containerID="cri-o://a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf" gracePeriod=2
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.283676 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.424842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities\") pod \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") "
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.425000 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content\") pod \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") "
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.425065 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjmfs\" (UniqueName: \"kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs\") pod \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\" (UID: \"c328ad3c-41c4-419e-88ea-8bcd758dcb9e\") "
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.426340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities" (OuterVolumeSpecName: "utilities") pod "c328ad3c-41c4-419e-88ea-8bcd758dcb9e" (UID: "c328ad3c-41c4-419e-88ea-8bcd758dcb9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.429206 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.430402 4775 generic.go:334] "Generic (PLEG): container finished" podID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerID="a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf" exitCode=0
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.430434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerDied","Data":"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"}
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.430533 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpg9p" event={"ID":"c328ad3c-41c4-419e-88ea-8bcd758dcb9e","Type":"ContainerDied","Data":"1555376c25e05b9026c4e9c7682126d2efcfb8571cf0c2c03216cdf17dee8a8d"}
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.430555 4775 scope.go:117] "RemoveContainer" containerID="a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.430756 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpg9p"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.447774 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs" (OuterVolumeSpecName: "kube-api-access-mjmfs") pod "c328ad3c-41c4-419e-88ea-8bcd758dcb9e" (UID: "c328ad3c-41c4-419e-88ea-8bcd758dcb9e"). InnerVolumeSpecName "kube-api-access-mjmfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.502629 4775 scope.go:117] "RemoveContainer" containerID="e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.511974 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c328ad3c-41c4-419e-88ea-8bcd758dcb9e" (UID: "c328ad3c-41c4-419e-88ea-8bcd758dcb9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.521355 4775 scope.go:117] "RemoveContainer" containerID="9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.530923 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.531157 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjmfs\" (UniqueName: \"kubernetes.io/projected/c328ad3c-41c4-419e-88ea-8bcd758dcb9e-kube-api-access-mjmfs\") on node \"crc\" DevicePath \"\""
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.598358 4775 scope.go:117] "RemoveContainer" containerID="a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: E0321 05:44:33.600590 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf\": container with ID starting with a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf not found: ID does not exist" containerID="a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.600701 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf"} err="failed to get container status \"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf\": rpc error: code = NotFound desc = could not find container \"a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf\": container with ID starting with a5d4a873d066150b522769e8838cafe2ee9965846f80ed997ebe148b5055e6bf not found: ID does not exist"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.600801 4775 scope.go:117] "RemoveContainer" containerID="e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: E0321 05:44:33.601077 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf\": container with ID starting with e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf not found: ID does not exist" containerID="e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.601188 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf"} err="failed to get container status \"e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf\": rpc error: code = NotFound desc = could not find container \"e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf\": container with ID starting with e5c7b15bf2740b46601cb7a23aa579db3b92a3ad5e9f07f7746cfac8bf93d1bf not found: ID does not exist"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.601281 4775 scope.go:117] "RemoveContainer" containerID="9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad"
Mar 21 05:44:33 crc kubenswrapper[4775]: E0321 05:44:33.603346 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad\": container with ID starting with 9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad not found: ID does not exist" containerID="9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.603404 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad"} err="failed to get container status \"9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad\": rpc error: code = NotFound desc = could not find container \"9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad\": container with ID starting with 9fb0be2f280c6c43746b5afba600c3d334f8a86d80cb113c8262eb021a4d9dad not found: ID does not exist"
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.757341 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:33 crc kubenswrapper[4775]: I0321 05:44:33.769574 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xpg9p"]
Mar 21 05:44:35 crc kubenswrapper[4775]: I0321 05:44:35.670847 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" path="/var/lib/kubelet/pods/c328ad3c-41c4-419e-88ea-8bcd758dcb9e/volumes"
Mar 21 05:44:48 crc kubenswrapper[4775]: I0321 05:44:48.655790 4775 scope.go:117] "RemoveContainer" containerID="111e8b4e076f4628dda53e32e64a39739272ad46a623d86e91f2a428452f6fb0"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.150113 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm"]
Mar 21 05:45:00 crc kubenswrapper[4775]: E0321 05:45:00.151093 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="extract-content"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.151107 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="extract-content"
Mar 21 05:45:00 crc kubenswrapper[4775]: E0321 05:45:00.151153 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="extract-utilities"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.151160 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="extract-utilities"
Mar 21 05:45:00 crc kubenswrapper[4775]: E0321 05:45:00.151178 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="registry-server"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.151184 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="registry-server"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.151354 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c328ad3c-41c4-419e-88ea-8bcd758dcb9e" containerName="registry-server"
Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.152035 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.155147 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.155793 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.164768 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm"] Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.274091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnmvv\" (UniqueName: \"kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.274310 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.274390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.376526 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.376672 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.376826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnmvv\" (UniqueName: \"kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.378161 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.384214 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.394609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnmvv\" (UniqueName: \"kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv\") pod \"collect-profiles-29567865-dtkzm\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.484871 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:00 crc kubenswrapper[4775]: I0321 05:45:00.964317 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm"] Mar 21 05:45:01 crc kubenswrapper[4775]: I0321 05:45:01.145180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" event={"ID":"853b684f-d127-48b8-9dd6-5eed4edccfe6","Type":"ContainerStarted","Data":"1b4c8b22e4aa0409d0b120e79806d2e1fd24b5cce756221596dd023358442451"} Mar 21 05:45:01 crc kubenswrapper[4775]: I0321 05:45:01.145249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" event={"ID":"853b684f-d127-48b8-9dd6-5eed4edccfe6","Type":"ContainerStarted","Data":"a47c618d2b09bdc0b59e70299cd03eb578c080155feeb1dd527de0cec62c45e6"} Mar 21 05:45:01 crc kubenswrapper[4775]: I0321 05:45:01.167795 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" 
podStartSLOduration=1.167775832 podStartE2EDuration="1.167775832s" podCreationTimestamp="2026-03-21 05:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:45:01.161628117 +0000 UTC m=+3454.138091741" watchObservedRunningTime="2026-03-21 05:45:01.167775832 +0000 UTC m=+3454.144239456" Mar 21 05:45:02 crc kubenswrapper[4775]: I0321 05:45:02.157853 4775 generic.go:334] "Generic (PLEG): container finished" podID="853b684f-d127-48b8-9dd6-5eed4edccfe6" containerID="1b4c8b22e4aa0409d0b120e79806d2e1fd24b5cce756221596dd023358442451" exitCode=0 Mar 21 05:45:02 crc kubenswrapper[4775]: I0321 05:45:02.157940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" event={"ID":"853b684f-d127-48b8-9dd6-5eed4edccfe6","Type":"ContainerDied","Data":"1b4c8b22e4aa0409d0b120e79806d2e1fd24b5cce756221596dd023358442451"} Mar 21 05:45:02 crc kubenswrapper[4775]: I0321 05:45:02.482974 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:45:02 crc kubenswrapper[4775]: I0321 05:45:02.483064 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.563616 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.648279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnmvv\" (UniqueName: \"kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv\") pod \"853b684f-d127-48b8-9dd6-5eed4edccfe6\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.648339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume\") pod \"853b684f-d127-48b8-9dd6-5eed4edccfe6\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.648426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume\") pod \"853b684f-d127-48b8-9dd6-5eed4edccfe6\" (UID: \"853b684f-d127-48b8-9dd6-5eed4edccfe6\") " Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.649968 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume" (OuterVolumeSpecName: "config-volume") pod "853b684f-d127-48b8-9dd6-5eed4edccfe6" (UID: "853b684f-d127-48b8-9dd6-5eed4edccfe6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.656652 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "853b684f-d127-48b8-9dd6-5eed4edccfe6" (UID: "853b684f-d127-48b8-9dd6-5eed4edccfe6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.656695 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv" (OuterVolumeSpecName: "kube-api-access-hnmvv") pod "853b684f-d127-48b8-9dd6-5eed4edccfe6" (UID: "853b684f-d127-48b8-9dd6-5eed4edccfe6"). InnerVolumeSpecName "kube-api-access-hnmvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.750679 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/853b684f-d127-48b8-9dd6-5eed4edccfe6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.751045 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnmvv\" (UniqueName: \"kubernetes.io/projected/853b684f-d127-48b8-9dd6-5eed4edccfe6-kube-api-access-hnmvv\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:03 crc kubenswrapper[4775]: I0321 05:45:03.751316 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/853b684f-d127-48b8-9dd6-5eed4edccfe6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:45:04 crc kubenswrapper[4775]: I0321 05:45:04.181795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" event={"ID":"853b684f-d127-48b8-9dd6-5eed4edccfe6","Type":"ContainerDied","Data":"a47c618d2b09bdc0b59e70299cd03eb578c080155feeb1dd527de0cec62c45e6"} Mar 21 05:45:04 crc kubenswrapper[4775]: I0321 05:45:04.182166 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47c618d2b09bdc0b59e70299cd03eb578c080155feeb1dd527de0cec62c45e6" Mar 21 05:45:04 crc kubenswrapper[4775]: I0321 05:45:04.181896 4775 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567865-dtkzm" Mar 21 05:45:04 crc kubenswrapper[4775]: I0321 05:45:04.240202 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq"] Mar 21 05:45:04 crc kubenswrapper[4775]: I0321 05:45:04.249546 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-pcqhq"] Mar 21 05:45:05 crc kubenswrapper[4775]: I0321 05:45:05.696290 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="421d78dc-59bb-4b5b-9738-1eb6a6144b38" path="/var/lib/kubelet/pods/421d78dc-59bb-4b5b-9738-1eb6a6144b38/volumes" Mar 21 05:45:32 crc kubenswrapper[4775]: I0321 05:45:32.481867 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:45:32 crc kubenswrapper[4775]: I0321 05:45:32.482410 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:45:48 crc kubenswrapper[4775]: I0321 05:45:48.751864 4775 scope.go:117] "RemoveContainer" containerID="19febf6fd2202a1482d38e4d840812756bb31244e9ab9826c5c08685c9d1d70a" Mar 21 05:45:59 crc kubenswrapper[4775]: I0321 05:45:59.973555 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rq7vc"] Mar 21 05:45:59 crc kubenswrapper[4775]: E0321 05:45:59.974468 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="853b684f-d127-48b8-9dd6-5eed4edccfe6" containerName="collect-profiles" Mar 21 05:45:59 crc kubenswrapper[4775]: I0321 05:45:59.974484 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="853b684f-d127-48b8-9dd6-5eed4edccfe6" containerName="collect-profiles" Mar 21 05:45:59 crc kubenswrapper[4775]: I0321 05:45:59.974694 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="853b684f-d127-48b8-9dd6-5eed4edccfe6" containerName="collect-profiles" Mar 21 05:45:59 crc kubenswrapper[4775]: I0321 05:45:59.976231 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:45:59 crc kubenswrapper[4775]: I0321 05:45:59.996797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rq7vc"] Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.064718 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.065104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrdd\" (UniqueName: \"kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.065223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities\") pod \"community-operators-rq7vc\" (UID: 
\"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.148395 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567866-r9qrw"] Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.150076 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.154005 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.155226 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.156559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.159706 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-r9qrw"] Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.166740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrdd\" (UniqueName: \"kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.166814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc 
kubenswrapper[4775]: I0321 05:46:00.166934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.167634 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.167814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.199394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrdd\" (UniqueName: \"kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd\") pod \"community-operators-rq7vc\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") " pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.268882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rncg7\" (UniqueName: \"kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7\") pod \"auto-csr-approver-29567866-r9qrw\" (UID: \"52968d61-f2a8-4bcd-9e97-cb01c508a52e\") " pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:00 crc 
kubenswrapper[4775]: I0321 05:46:00.303331 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rq7vc" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.370747 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rncg7\" (UniqueName: \"kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7\") pod \"auto-csr-approver-29567866-r9qrw\" (UID: \"52968d61-f2a8-4bcd-9e97-cb01c508a52e\") " pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.394988 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rncg7\" (UniqueName: \"kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7\") pod \"auto-csr-approver-29567866-r9qrw\" (UID: \"52968d61-f2a8-4bcd-9e97-cb01c508a52e\") " pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.472786 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:00 crc kubenswrapper[4775]: I0321 05:46:00.916885 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rq7vc"] Mar 21 05:46:01 crc kubenswrapper[4775]: I0321 05:46:01.062608 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-r9qrw"] Mar 21 05:46:01 crc kubenswrapper[4775]: W0321 05:46:01.066175 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52968d61_f2a8_4bcd_9e97_cb01c508a52e.slice/crio-80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3 WatchSource:0}: Error finding container 80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3: Status 404 returned error can't find the container with id 80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3 Mar 21 05:46:01 crc kubenswrapper[4775]: I0321 05:46:01.718779 4775 generic.go:334] "Generic (PLEG): container finished" podID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerID="8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b" exitCode=0 Mar 21 05:46:01 crc kubenswrapper[4775]: I0321 05:46:01.718885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerDied","Data":"8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b"} Mar 21 05:46:01 crc kubenswrapper[4775]: I0321 05:46:01.718959 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerStarted","Data":"1a44081b04cb6ba41f124a0f5e6ff0b10d25dc91b013064ce6b7e927c43304be"} Mar 21 05:46:01 crc kubenswrapper[4775]: I0321 05:46:01.720076 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567866-r9qrw" event={"ID":"52968d61-f2a8-4bcd-9e97-cb01c508a52e","Type":"ContainerStarted","Data":"80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3"} Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.482464 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.483063 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.483109 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.483906 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.483956 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2" gracePeriod=600 Mar 21 
05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.729419 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2" exitCode=0 Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.729497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2"} Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.729551 4775 scope.go:117] "RemoveContainer" containerID="f0c9b510088c7d635cf52734c112efecde67f87d29c60e9903dd8c0901a91ef9" Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.735669 4775 generic.go:334] "Generic (PLEG): container finished" podID="52968d61-f2a8-4bcd-9e97-cb01c508a52e" containerID="875a527a3f14334ab80b89530523916cfe6a1d4008ed65204aa4a0f63ca04790" exitCode=0 Mar 21 05:46:02 crc kubenswrapper[4775]: I0321 05:46:02.735780 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-r9qrw" event={"ID":"52968d61-f2a8-4bcd-9e97-cb01c508a52e","Type":"ContainerDied","Data":"875a527a3f14334ab80b89530523916cfe6a1d4008ed65204aa4a0f63ca04790"} Mar 21 05:46:03 crc kubenswrapper[4775]: I0321 05:46:03.749924 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerStarted","Data":"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"} Mar 21 05:46:03 crc kubenswrapper[4775]: I0321 05:46:03.752837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"} Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.176379 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-r9qrw" Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.270882 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rncg7\" (UniqueName: \"kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7\") pod \"52968d61-f2a8-4bcd-9e97-cb01c508a52e\" (UID: \"52968d61-f2a8-4bcd-9e97-cb01c508a52e\") " Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.277623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7" (OuterVolumeSpecName: "kube-api-access-rncg7") pod "52968d61-f2a8-4bcd-9e97-cb01c508a52e" (UID: "52968d61-f2a8-4bcd-9e97-cb01c508a52e"). InnerVolumeSpecName "kube-api-access-rncg7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.373074 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rncg7\" (UniqueName: \"kubernetes.io/projected/52968d61-f2a8-4bcd-9e97-cb01c508a52e-kube-api-access-rncg7\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.762813 4775 generic.go:334] "Generic (PLEG): container finished" podID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerID="91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8" exitCode=0
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.762861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerDied","Data":"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"}
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.764676 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567866-r9qrw"
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.764648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567866-r9qrw" event={"ID":"52968d61-f2a8-4bcd-9e97-cb01c508a52e","Type":"ContainerDied","Data":"80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3"}
Mar 21 05:46:04 crc kubenswrapper[4775]: I0321 05:46:04.764741 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80eeb92ee1ea5558f25c003260f13ac5085ae991a234da84daaed0e1fb2960b3"
Mar 21 05:46:05 crc kubenswrapper[4775]: I0321 05:46:05.247151 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-f4gsq"]
Mar 21 05:46:05 crc kubenswrapper[4775]: I0321 05:46:05.257547 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-f4gsq"]
Mar 21 05:46:05 crc kubenswrapper[4775]: I0321 05:46:05.672331 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fbc24f-3528-4778-ae85-23e0c2fe99d9" path="/var/lib/kubelet/pods/42fbc24f-3528-4778-ae85-23e0c2fe99d9/volumes"
Mar 21 05:46:05 crc kubenswrapper[4775]: I0321 05:46:05.776202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerStarted","Data":"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"}
Mar 21 05:46:05 crc kubenswrapper[4775]: I0321 05:46:05.794704 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rq7vc" podStartSLOduration=3.294294484 podStartE2EDuration="6.794690562s" podCreationTimestamp="2026-03-21 05:45:59 +0000 UTC" firstStartedPulling="2026-03-21 05:46:01.720686624 +0000 UTC m=+3514.697150248" lastFinishedPulling="2026-03-21 05:46:05.221082702 +0000 UTC m=+3518.197546326" observedRunningTime="2026-03-21 05:46:05.794091125 +0000 UTC m=+3518.770554749" watchObservedRunningTime="2026-03-21 05:46:05.794690562 +0000 UTC m=+3518.771154186"
Mar 21 05:46:10 crc kubenswrapper[4775]: I0321 05:46:10.304703 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:10 crc kubenswrapper[4775]: I0321 05:46:10.306669 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:10 crc kubenswrapper[4775]: I0321 05:46:10.411347 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:10 crc kubenswrapper[4775]: I0321 05:46:10.888316 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:10 crc kubenswrapper[4775]: I0321 05:46:10.942735 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rq7vc"]
Mar 21 05:46:12 crc kubenswrapper[4775]: I0321 05:46:12.844353 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rq7vc" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="registry-server" containerID="cri-o://05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366" gracePeriod=2
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.373005 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.453526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content\") pod \"ad7dafc3-787f-416c-8d24-e14205d49ef9\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") "
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.453576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htrdd\" (UniqueName: \"kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd\") pod \"ad7dafc3-787f-416c-8d24-e14205d49ef9\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") "
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.453657 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities\") pod \"ad7dafc3-787f-416c-8d24-e14205d49ef9\" (UID: \"ad7dafc3-787f-416c-8d24-e14205d49ef9\") "
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.454771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities" (OuterVolumeSpecName: "utilities") pod "ad7dafc3-787f-416c-8d24-e14205d49ef9" (UID: "ad7dafc3-787f-416c-8d24-e14205d49ef9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.460192 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd" (OuterVolumeSpecName: "kube-api-access-htrdd") pod "ad7dafc3-787f-416c-8d24-e14205d49ef9" (UID: "ad7dafc3-787f-416c-8d24-e14205d49ef9"). InnerVolumeSpecName "kube-api-access-htrdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.520991 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad7dafc3-787f-416c-8d24-e14205d49ef9" (UID: "ad7dafc3-787f-416c-8d24-e14205d49ef9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.556327 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.556367 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htrdd\" (UniqueName: \"kubernetes.io/projected/ad7dafc3-787f-416c-8d24-e14205d49ef9-kube-api-access-htrdd\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.556376 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad7dafc3-787f-416c-8d24-e14205d49ef9-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.855490 4775 generic.go:334] "Generic (PLEG): container finished" podID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerID="05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366" exitCode=0
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.855588 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerDied","Data":"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"}
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.855880 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rq7vc" event={"ID":"ad7dafc3-787f-416c-8d24-e14205d49ef9","Type":"ContainerDied","Data":"1a44081b04cb6ba41f124a0f5e6ff0b10d25dc91b013064ce6b7e927c43304be"}
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.855905 4775 scope.go:117] "RemoveContainer" containerID="05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.855595 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rq7vc"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.884087 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rq7vc"]
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.886710 4775 scope.go:117] "RemoveContainer" containerID="91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.894889 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rq7vc"]
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.911525 4775 scope.go:117] "RemoveContainer" containerID="8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.963752 4775 scope.go:117] "RemoveContainer" containerID="05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"
Mar 21 05:46:13 crc kubenswrapper[4775]: E0321 05:46:13.964355 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366\": container with ID starting with 05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366 not found: ID does not exist" containerID="05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.964425 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366"} err="failed to get container status \"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366\": rpc error: code = NotFound desc = could not find container \"05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366\": container with ID starting with 05a691707ef3970b49493cbe9973459894a5f44eb36bfdb928a95355b3928366 not found: ID does not exist"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.964464 4775 scope.go:117] "RemoveContainer" containerID="91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"
Mar 21 05:46:13 crc kubenswrapper[4775]: E0321 05:46:13.964802 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8\": container with ID starting with 91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8 not found: ID does not exist" containerID="91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.964835 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8"} err="failed to get container status \"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8\": rpc error: code = NotFound desc = could not find container \"91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8\": container with ID starting with 91e8f06f511abc06a57cb2ea3ef15bf4b3dcdd6588f6404ed6c313f7cdc50de8 not found: ID does not exist"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.964857 4775 scope.go:117] "RemoveContainer" containerID="8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b"
Mar 21 05:46:13 crc kubenswrapper[4775]: E0321 05:46:13.965378 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b\": container with ID starting with 8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b not found: ID does not exist" containerID="8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b"
Mar 21 05:46:13 crc kubenswrapper[4775]: I0321 05:46:13.965454 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b"} err="failed to get container status \"8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b\": rpc error: code = NotFound desc = could not find container \"8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b\": container with ID starting with 8dc9d5d632ab88270b647b5baf2266422cef8bce7289c3c70bdb45fd5569924b not found: ID does not exist"
Mar 21 05:46:15 crc kubenswrapper[4775]: I0321 05:46:15.672799 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" path="/var/lib/kubelet/pods/ad7dafc3-787f-416c-8d24-e14205d49ef9/volumes"
Mar 21 05:46:24 crc kubenswrapper[4775]: I0321 05:46:24.981695 4775 generic.go:334] "Generic (PLEG): container finished" podID="1c832898-838d-423d-8ad8-512c5ee5706c" containerID="e5d5a6749ca728ee30628ed8c23d0df8c8193d878227278bf224646abd9c8806" exitCode=0
Mar 21 05:46:24 crc kubenswrapper[4775]: I0321 05:46:24.982287 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1c832898-838d-423d-8ad8-512c5ee5706c","Type":"ContainerDied","Data":"e5d5a6749ca728ee30628ed8c23d0df8c8193d878227278bf224646abd9c8806"}
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.377985 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.505963 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506129 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506206 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506308 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506385 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdw2v\" (UniqueName: \"kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506444 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.506507 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key\") pod \"1c832898-838d-423d-8ad8-512c5ee5706c\" (UID: \"1c832898-838d-423d-8ad8-512c5ee5706c\") "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.507030 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data" (OuterVolumeSpecName: "config-data") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.507647 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.512085 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.512549 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v" (OuterVolumeSpecName: "kube-api-access-gdw2v") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "kube-api-access-gdw2v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.517389 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.536010 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.540148 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.564474 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.564982 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c832898-838d-423d-8ad8-512c5ee5706c" (UID: "1c832898-838d-423d-8ad8-512c5ee5706c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608461 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608513 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608522 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c832898-838d-423d-8ad8-512c5ee5706c-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608531 4775 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608566 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608577 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608586 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1c832898-838d-423d-8ad8-512c5ee5706c-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608596 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1c832898-838d-423d-8ad8-512c5ee5706c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.608605 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdw2v\" (UniqueName: \"kubernetes.io/projected/1c832898-838d-423d-8ad8-512c5ee5706c-kube-api-access-gdw2v\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.627205 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 21 05:46:26 crc kubenswrapper[4775]: I0321 05:46:26.711147 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 21 05:46:27 crc kubenswrapper[4775]: I0321 05:46:27.000349 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1c832898-838d-423d-8ad8-512c5ee5706c","Type":"ContainerDied","Data":"695de001f7207ca22144c3ce97e3ea6306a89b43020987a1d4856f3f9e289416"}
Mar 21 05:46:27 crc kubenswrapper[4775]: I0321 05:46:27.000407 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="695de001f7207ca22144c3ce97e3ea6306a89b43020987a1d4856f3f9e289416"
Mar 21 05:46:27 crc kubenswrapper[4775]: I0321 05:46:27.000458 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.875724 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:46:33 crc kubenswrapper[4775]: E0321 05:46:33.876636 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="extract-content"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.876686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="extract-content"
Mar 21 05:46:33 crc kubenswrapper[4775]: E0321 05:46:33.876722 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="registry-server"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.876732 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="registry-server"
Mar 21 05:46:33 crc kubenswrapper[4775]: E0321 05:46:33.876764 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c832898-838d-423d-8ad8-512c5ee5706c" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.876774 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c832898-838d-423d-8ad8-512c5ee5706c" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:46:33 crc kubenswrapper[4775]: E0321 05:46:33.876792 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="extract-utilities"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.876802 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="extract-utilities"
Mar 21 05:46:33 crc kubenswrapper[4775]: E0321 05:46:33.876819 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52968d61-f2a8-4bcd-9e97-cb01c508a52e" containerName="oc"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.876828 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="52968d61-f2a8-4bcd-9e97-cb01c508a52e" containerName="oc"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.877085 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="52968d61-f2a8-4bcd-9e97-cb01c508a52e" containerName="oc"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.877142 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7dafc3-787f-416c-8d24-e14205d49ef9" containerName="registry-server"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.877160 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c832898-838d-423d-8ad8-512c5ee5706c" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.878011 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.881100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bnv6q"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.889592 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.965996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74n4l\" (UniqueName: \"kubernetes.io/projected/2c99faa4-db71-4a05-a018-9c382f33f55e-kube-api-access-74n4l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:33 crc kubenswrapper[4775]: I0321 05:46:33.966428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.068922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74n4l\" (UniqueName: \"kubernetes.io/projected/2c99faa4-db71-4a05-a018-9c382f33f55e-kube-api-access-74n4l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.069043 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.069568 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.092836 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74n4l\" (UniqueName: \"kubernetes.io/projected/2c99faa4-db71-4a05-a018-9c382f33f55e-kube-api-access-74n4l\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.101171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2c99faa4-db71-4a05-a018-9c382f33f55e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.210873 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:46:34 crc kubenswrapper[4775]: I0321 05:46:34.650412 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:46:35 crc kubenswrapper[4775]: I0321 05:46:35.105690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2c99faa4-db71-4a05-a018-9c382f33f55e","Type":"ContainerStarted","Data":"a1b6e5eb529eddaf4e17d335f3dd4d0c4c8307feaa5af921a19809433863ab62"}
Mar 21 05:46:36 crc kubenswrapper[4775]: I0321 05:46:36.115911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2c99faa4-db71-4a05-a018-9c382f33f55e","Type":"ContainerStarted","Data":"a379f3260971c9b4bf0cc3e889ee5a76416e8393c60c47aa6b68e112ce28ff1e"}
Mar 21 05:46:36 crc kubenswrapper[4775]: I0321 05:46:36.132980 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.314587789 podStartE2EDuration="3.13296503s" podCreationTimestamp="2026-03-21 05:46:33 +0000 UTC" firstStartedPulling="2026-03-21 05:46:34.65304146 +0000 UTC m=+3547.629505164" lastFinishedPulling="2026-03-21 05:46:35.471418781 +0000 UTC m=+3548.447882405" observedRunningTime="2026-03-21 05:46:36.130790128 +0000 UTC m=+3549.107253762" watchObservedRunningTime="2026-03-21 05:46:36.13296503 +0000 UTC m=+3549.109428654"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.061891 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"]
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.068674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.074950 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"]
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.186680 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.186750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcts\" (UniqueName: \"kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.186842 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.288685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.288763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcts\" (UniqueName: \"kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.288878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.289456 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.289495 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69"
Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.313654 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcts\" (UniqueName: \"kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts\") pod \"redhat-marketplace-mtx69\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.407578 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:46:47 crc kubenswrapper[4775]: I0321 05:46:47.917756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"] Mar 21 05:46:47 crc kubenswrapper[4775]: W0321 05:46:47.932138 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa06424_b2f2_4965_85d6_52f7824c1663.slice/crio-adafd4efc773e168282a543dba92bbe9e306c86f368448a6d3525650dc94f58a WatchSource:0}: Error finding container adafd4efc773e168282a543dba92bbe9e306c86f368448a6d3525650dc94f58a: Status 404 returned error can't find the container with id adafd4efc773e168282a543dba92bbe9e306c86f368448a6d3525650dc94f58a Mar 21 05:46:48 crc kubenswrapper[4775]: I0321 05:46:48.250061 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerID="8c9a5a3c4fe60a9ba70c2bdb23fc13eb6fbc93c9d639fb16cff54b7b76bd3b9c" exitCode=0 Mar 21 05:46:48 crc kubenswrapper[4775]: I0321 05:46:48.250154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerDied","Data":"8c9a5a3c4fe60a9ba70c2bdb23fc13eb6fbc93c9d639fb16cff54b7b76bd3b9c"} Mar 21 05:46:48 crc kubenswrapper[4775]: I0321 05:46:48.250213 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerStarted","Data":"adafd4efc773e168282a543dba92bbe9e306c86f368448a6d3525650dc94f58a"} Mar 21 05:46:48 crc kubenswrapper[4775]: I0321 05:46:48.806425 4775 scope.go:117] "RemoveContainer" containerID="772f50d7983f92c55a24cd77459f228985481da63abdb797f0dc2f295541ac61" Mar 21 05:46:49 crc kubenswrapper[4775]: I0321 05:46:49.267775 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerStarted","Data":"d0545665cdd8e41c947743dcd9829bdc046d8dd1683e708f87ff1c576ef1ba0f"} Mar 21 05:46:50 crc kubenswrapper[4775]: I0321 05:46:50.279202 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerID="d0545665cdd8e41c947743dcd9829bdc046d8dd1683e708f87ff1c576ef1ba0f" exitCode=0 Mar 21 05:46:50 crc kubenswrapper[4775]: I0321 05:46:50.279316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerDied","Data":"d0545665cdd8e41c947743dcd9829bdc046d8dd1683e708f87ff1c576ef1ba0f"} Mar 21 05:46:50 crc kubenswrapper[4775]: I0321 05:46:50.279757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerStarted","Data":"82a27cb328425f943a3f5240bbcae1d99975930a49e7720038822aa648d7b9d5"} Mar 21 05:46:50 crc kubenswrapper[4775]: I0321 05:46:50.317490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtx69" podStartSLOduration=1.883296724 podStartE2EDuration="3.31747054s" podCreationTimestamp="2026-03-21 05:46:47 +0000 UTC" firstStartedPulling="2026-03-21 05:46:48.25168906 +0000 UTC m=+3561.228152684" 
lastFinishedPulling="2026-03-21 05:46:49.685862876 +0000 UTC m=+3562.662326500" observedRunningTime="2026-03-21 05:46:50.311085528 +0000 UTC m=+3563.287549162" watchObservedRunningTime="2026-03-21 05:46:50.31747054 +0000 UTC m=+3563.293934164" Mar 21 05:46:57 crc kubenswrapper[4775]: I0321 05:46:57.407974 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:46:57 crc kubenswrapper[4775]: I0321 05:46:57.408481 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:46:57 crc kubenswrapper[4775]: I0321 05:46:57.456079 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:46:58 crc kubenswrapper[4775]: I0321 05:46:58.398107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:47:01 crc kubenswrapper[4775]: I0321 05:47:01.025025 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"] Mar 21 05:47:01 crc kubenswrapper[4775]: I0321 05:47:01.025588 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtx69" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="registry-server" containerID="cri-o://82a27cb328425f943a3f5240bbcae1d99975930a49e7720038822aa648d7b9d5" gracePeriod=2 Mar 21 05:47:01 crc kubenswrapper[4775]: I0321 05:47:01.381738 4775 generic.go:334] "Generic (PLEG): container finished" podID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerID="82a27cb328425f943a3f5240bbcae1d99975930a49e7720038822aa648d7b9d5" exitCode=0 Mar 21 05:47:01 crc kubenswrapper[4775]: I0321 05:47:01.381781 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" 
event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerDied","Data":"82a27cb328425f943a3f5240bbcae1d99975930a49e7720038822aa648d7b9d5"} Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.098254 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.286489 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcts\" (UniqueName: \"kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts\") pod \"7fa06424-b2f2-4965-85d6-52f7824c1663\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.286838 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content\") pod \"7fa06424-b2f2-4965-85d6-52f7824c1663\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.287109 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities\") pod \"7fa06424-b2f2-4965-85d6-52f7824c1663\" (UID: \"7fa06424-b2f2-4965-85d6-52f7824c1663\") " Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.287828 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities" (OuterVolumeSpecName: "utilities") pod "7fa06424-b2f2-4965-85d6-52f7824c1663" (UID: "7fa06424-b2f2-4965-85d6-52f7824c1663"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.288091 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.292286 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts" (OuterVolumeSpecName: "kube-api-access-rfcts") pod "7fa06424-b2f2-4965-85d6-52f7824c1663" (UID: "7fa06424-b2f2-4965-85d6-52f7824c1663"). InnerVolumeSpecName "kube-api-access-rfcts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.317411 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fa06424-b2f2-4965-85d6-52f7824c1663" (UID: "7fa06424-b2f2-4965-85d6-52f7824c1663"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.403201 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcts\" (UniqueName: \"kubernetes.io/projected/7fa06424-b2f2-4965-85d6-52f7824c1663-kube-api-access-rfcts\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.403247 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa06424-b2f2-4965-85d6-52f7824c1663-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.408425 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtx69" event={"ID":"7fa06424-b2f2-4965-85d6-52f7824c1663","Type":"ContainerDied","Data":"adafd4efc773e168282a543dba92bbe9e306c86f368448a6d3525650dc94f58a"} Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.408486 4775 scope.go:117] "RemoveContainer" containerID="82a27cb328425f943a3f5240bbcae1d99975930a49e7720038822aa648d7b9d5" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.408613 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtx69" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.437460 4775 scope.go:117] "RemoveContainer" containerID="d0545665cdd8e41c947743dcd9829bdc046d8dd1683e708f87ff1c576ef1ba0f" Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.448157 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"] Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.455307 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtx69"] Mar 21 05:47:02 crc kubenswrapper[4775]: I0321 05:47:02.458817 4775 scope.go:117] "RemoveContainer" containerID="8c9a5a3c4fe60a9ba70c2bdb23fc13eb6fbc93c9d639fb16cff54b7b76bd3b9c" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.464515 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8t2r/must-gather-znpkx"] Mar 21 05:47:03 crc kubenswrapper[4775]: E0321 05:47:03.465424 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="extract-content" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.465445 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="extract-content" Mar 21 05:47:03 crc kubenswrapper[4775]: E0321 05:47:03.465454 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="extract-utilities" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.465463 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="extract-utilities" Mar 21 05:47:03 crc kubenswrapper[4775]: E0321 05:47:03.465475 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="registry-server" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 
05:47:03.465484 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="registry-server" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.465702 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" containerName="registry-server" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.467042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.468496 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b8t2r"/"default-dockercfg-wsz9m" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.470952 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8t2r"/"openshift-service-ca.crt" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.471767 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b8t2r"/"kube-root-ca.crt" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.491478 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8t2r/must-gather-znpkx"] Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.527227 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8wn\" (UniqueName: \"kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn\") pod \"must-gather-znpkx\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.527330 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output\") pod \"must-gather-znpkx\" 
(UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.629090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8wn\" (UniqueName: \"kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn\") pod \"must-gather-znpkx\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.629192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output\") pod \"must-gather-znpkx\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.629674 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output\") pod \"must-gather-znpkx\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.656810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8wn\" (UniqueName: \"kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn\") pod \"must-gather-znpkx\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") " pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.673339 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa06424-b2f2-4965-85d6-52f7824c1663" path="/var/lib/kubelet/pods/7fa06424-b2f2-4965-85d6-52f7824c1663/volumes" Mar 21 05:47:03 crc kubenswrapper[4775]: I0321 05:47:03.783869 4775 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/must-gather-znpkx" Mar 21 05:47:04 crc kubenswrapper[4775]: I0321 05:47:04.356672 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8t2r/must-gather-znpkx"] Mar 21 05:47:04 crc kubenswrapper[4775]: I0321 05:47:04.430233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/must-gather-znpkx" event={"ID":"77cb0b91-097f-4e81-888d-57b2d37399bd","Type":"ContainerStarted","Data":"9283ebedf4adbab35d77585b5d52cc34e6dca3a8e4311ff13ca4f606f2d25f69"} Mar 21 05:47:14 crc kubenswrapper[4775]: I0321 05:47:14.519322 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/must-gather-znpkx" event={"ID":"77cb0b91-097f-4e81-888d-57b2d37399bd","Type":"ContainerStarted","Data":"76eca6bf5897fa3d0bd946a5ac8c7c93a3f123f745216d5491ccbb010515a8cc"} Mar 21 05:47:14 crc kubenswrapper[4775]: I0321 05:47:14.519951 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/must-gather-znpkx" event={"ID":"77cb0b91-097f-4e81-888d-57b2d37399bd","Type":"ContainerStarted","Data":"891104e8ee294ddf739928b04e0a1b0e8f6bf51e7a90dfe50eb44a5a6ed19b18"} Mar 21 05:47:14 crc kubenswrapper[4775]: I0321 05:47:14.542243 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8t2r/must-gather-znpkx" podStartSLOduration=2.451127172 podStartE2EDuration="11.542221378s" podCreationTimestamp="2026-03-21 05:47:03 +0000 UTC" firstStartedPulling="2026-03-21 05:47:04.364784907 +0000 UTC m=+3577.341248531" lastFinishedPulling="2026-03-21 05:47:13.455879113 +0000 UTC m=+3586.432342737" observedRunningTime="2026-03-21 05:47:14.534866008 +0000 UTC m=+3587.511329642" watchObservedRunningTime="2026-03-21 05:47:14.542221378 +0000 UTC m=+3587.518685002" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.553094 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-b8t2r/crc-debug-hwlzx"] Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.556893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.586834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnqz\" (UniqueName: \"kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.587007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.689043 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.689296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnqz\" (UniqueName: \"kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.689652 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.718619 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqnqz\" (UniqueName: \"kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz\") pod \"crc-debug-hwlzx\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: I0321 05:47:18.878539 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:47:18 crc kubenswrapper[4775]: W0321 05:47:18.914826 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c91c96_06cc_4b45_ae11_64b80671e80b.slice/crio-c614da3ddde7ddcfaca9a6477d42e9beb54e92477909095bbe7803e50891019d WatchSource:0}: Error finding container c614da3ddde7ddcfaca9a6477d42e9beb54e92477909095bbe7803e50891019d: Status 404 returned error can't find the container with id c614da3ddde7ddcfaca9a6477d42e9beb54e92477909095bbe7803e50891019d Mar 21 05:47:19 crc kubenswrapper[4775]: I0321 05:47:19.573542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" event={"ID":"e4c91c96-06cc-4b45-ae11-64b80671e80b","Type":"ContainerStarted","Data":"c614da3ddde7ddcfaca9a6477d42e9beb54e92477909095bbe7803e50891019d"} Mar 21 05:47:35 crc kubenswrapper[4775]: E0321 05:47:35.686486 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 21 05:47:35 crc kubenswrapper[4775]: E0321 
05:47:35.687254 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqnqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-hwlzx_openshift-must-gather-b8t2r(e4c91c96-06cc-4b45-ae11-64b80671e80b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:47:35 crc kubenswrapper[4775]: E0321 05:47:35.688379 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" Mar 21 05:47:35 crc kubenswrapper[4775]: E0321 05:47:35.743756 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" Mar 21 05:47:48 crc kubenswrapper[4775]: I0321 05:47:48.859523 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" event={"ID":"e4c91c96-06cc-4b45-ae11-64b80671e80b","Type":"ContainerStarted","Data":"59c30d6d373f941e5b2724ff5fbfdf61ac691e4e5c44b0b8bd0b70a7f5049699"} Mar 21 05:47:48 crc kubenswrapper[4775]: I0321 05:47:48.883221 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" podStartSLOduration=1.930702717 podStartE2EDuration="30.88319701s" podCreationTimestamp="2026-03-21 05:47:18 +0000 UTC" firstStartedPulling="2026-03-21 05:47:18.920639969 +0000 UTC m=+3591.897103593" lastFinishedPulling="2026-03-21 05:47:47.873134262 +0000 UTC m=+3620.849597886" observedRunningTime="2026-03-21 05:47:48.881262715 +0000 UTC m=+3621.857726349" watchObservedRunningTime="2026-03-21 05:47:48.88319701 +0000 UTC m=+3621.859660634" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.149011 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567868-8z86d"] Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.151348 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.153676 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.154984 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.156134 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.163739 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-8z86d"] Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.212786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dsp9\" (UniqueName: \"kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9\") pod \"auto-csr-approver-29567868-8z86d\" (UID: \"bdeb3ead-3f1f-491f-9a15-f550065d18fb\") " pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.315762 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dsp9\" (UniqueName: \"kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9\") pod \"auto-csr-approver-29567868-8z86d\" (UID: \"bdeb3ead-3f1f-491f-9a15-f550065d18fb\") " pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.342156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dsp9\" (UniqueName: \"kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9\") pod \"auto-csr-approver-29567868-8z86d\" (UID: \"bdeb3ead-3f1f-491f-9a15-f550065d18fb\") " 
pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:00 crc kubenswrapper[4775]: I0321 05:48:00.864613 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:01 crc kubenswrapper[4775]: I0321 05:48:01.352563 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-8z86d"] Mar 21 05:48:02 crc kubenswrapper[4775]: I0321 05:48:02.007330 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-8z86d" event={"ID":"bdeb3ead-3f1f-491f-9a15-f550065d18fb","Type":"ContainerStarted","Data":"7d50155bebefa4c1f1d1c9ef73ca60fb016430db6c5e51096e9341267622df9b"} Mar 21 05:48:02 crc kubenswrapper[4775]: I0321 05:48:02.481933 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:48:02 crc kubenswrapper[4775]: I0321 05:48:02.482009 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:48:04 crc kubenswrapper[4775]: I0321 05:48:04.030138 4775 generic.go:334] "Generic (PLEG): container finished" podID="bdeb3ead-3f1f-491f-9a15-f550065d18fb" containerID="6995f01e339d899d20a0310c4264870e5b4687e152b20b7bc6f33c17d3fccaea" exitCode=0 Mar 21 05:48:04 crc kubenswrapper[4775]: I0321 05:48:04.030245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-8z86d" 
event={"ID":"bdeb3ead-3f1f-491f-9a15-f550065d18fb","Type":"ContainerDied","Data":"6995f01e339d899d20a0310c4264870e5b4687e152b20b7bc6f33c17d3fccaea"} Mar 21 05:48:05 crc kubenswrapper[4775]: I0321 05:48:05.410038 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:05 crc kubenswrapper[4775]: I0321 05:48:05.521543 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dsp9\" (UniqueName: \"kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9\") pod \"bdeb3ead-3f1f-491f-9a15-f550065d18fb\" (UID: \"bdeb3ead-3f1f-491f-9a15-f550065d18fb\") " Mar 21 05:48:05 crc kubenswrapper[4775]: I0321 05:48:05.534343 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9" (OuterVolumeSpecName: "kube-api-access-5dsp9") pod "bdeb3ead-3f1f-491f-9a15-f550065d18fb" (UID: "bdeb3ead-3f1f-491f-9a15-f550065d18fb"). InnerVolumeSpecName "kube-api-access-5dsp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:05 crc kubenswrapper[4775]: I0321 05:48:05.624028 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dsp9\" (UniqueName: \"kubernetes.io/projected/bdeb3ead-3f1f-491f-9a15-f550065d18fb-kube-api-access-5dsp9\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:06 crc kubenswrapper[4775]: I0321 05:48:06.052178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567868-8z86d" event={"ID":"bdeb3ead-3f1f-491f-9a15-f550065d18fb","Type":"ContainerDied","Data":"7d50155bebefa4c1f1d1c9ef73ca60fb016430db6c5e51096e9341267622df9b"} Mar 21 05:48:06 crc kubenswrapper[4775]: I0321 05:48:06.052218 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d50155bebefa4c1f1d1c9ef73ca60fb016430db6c5e51096e9341267622df9b" Mar 21 05:48:06 crc kubenswrapper[4775]: I0321 05:48:06.052255 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567868-8z86d" Mar 21 05:48:06 crc kubenswrapper[4775]: I0321 05:48:06.494885 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-bbnhp"] Mar 21 05:48:06 crc kubenswrapper[4775]: I0321 05:48:06.504973 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-bbnhp"] Mar 21 05:48:07 crc kubenswrapper[4775]: I0321 05:48:07.670827 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f5316a-d6df-4c37-b878-4f3ef82d837b" path="/var/lib/kubelet/pods/89f5316a-d6df-4c37-b878-4f3ef82d837b/volumes" Mar 21 05:48:26 crc kubenswrapper[4775]: I0321 05:48:26.286230 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4c91c96-06cc-4b45-ae11-64b80671e80b" containerID="59c30d6d373f941e5b2724ff5fbfdf61ac691e4e5c44b0b8bd0b70a7f5049699" exitCode=0 Mar 21 05:48:26 crc kubenswrapper[4775]: I0321 05:48:26.286318 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" event={"ID":"e4c91c96-06cc-4b45-ae11-64b80671e80b","Type":"ContainerDied","Data":"59c30d6d373f941e5b2724ff5fbfdf61ac691e4e5c44b0b8bd0b70a7f5049699"} Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.411589 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.451394 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-hwlzx"] Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.459727 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-hwlzx"] Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.486490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host\") pod \"e4c91c96-06cc-4b45-ae11-64b80671e80b\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.486614 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host" (OuterVolumeSpecName: "host") pod "e4c91c96-06cc-4b45-ae11-64b80671e80b" (UID: "e4c91c96-06cc-4b45-ae11-64b80671e80b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.486640 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnqz\" (UniqueName: \"kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz\") pod \"e4c91c96-06cc-4b45-ae11-64b80671e80b\" (UID: \"e4c91c96-06cc-4b45-ae11-64b80671e80b\") " Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.487013 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4c91c96-06cc-4b45-ae11-64b80671e80b-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.491985 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz" (OuterVolumeSpecName: "kube-api-access-kqnqz") pod "e4c91c96-06cc-4b45-ae11-64b80671e80b" (UID: "e4c91c96-06cc-4b45-ae11-64b80671e80b"). InnerVolumeSpecName "kube-api-access-kqnqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.588772 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnqz\" (UniqueName: \"kubernetes.io/projected/e4c91c96-06cc-4b45-ae11-64b80671e80b-kube-api-access-kqnqz\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:27 crc kubenswrapper[4775]: I0321 05:48:27.678729 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" path="/var/lib/kubelet/pods/e4c91c96-06cc-4b45-ae11-64b80671e80b/volumes" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.305232 4775 scope.go:117] "RemoveContainer" containerID="59c30d6d373f941e5b2724ff5fbfdf61ac691e4e5c44b0b8bd0b70a7f5049699" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.305284 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-hwlzx" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.614787 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-895n9"] Mar 21 05:48:28 crc kubenswrapper[4775]: E0321 05:48:28.615283 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" containerName="container-00" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.615299 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" containerName="container-00" Mar 21 05:48:28 crc kubenswrapper[4775]: E0321 05:48:28.615326 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeb3ead-3f1f-491f-9a15-f550065d18fb" containerName="oc" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.615337 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeb3ead-3f1f-491f-9a15-f550065d18fb" containerName="oc" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.615634 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeb3ead-3f1f-491f-9a15-f550065d18fb" containerName="oc" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.615666 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c91c96-06cc-4b45-ae11-64b80671e80b" containerName="container-00" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.617108 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.712669 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxz5n\" (UniqueName: \"kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.712873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.815245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxz5n\" (UniqueName: \"kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.815368 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.815707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc 
kubenswrapper[4775]: I0321 05:48:28.839274 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxz5n\" (UniqueName: \"kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n\") pod \"crc-debug-895n9\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: I0321 05:48:28.937267 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:28 crc kubenswrapper[4775]: W0321 05:48:28.966817 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb0eb42_ffda_49af_a49c_d9ecdb7af97f.slice/crio-28871b09be768f444462e0b438b5c3800defbdffe8f0e221f6f09be59d8d2535 WatchSource:0}: Error finding container 28871b09be768f444462e0b438b5c3800defbdffe8f0e221f6f09be59d8d2535: Status 404 returned error can't find the container with id 28871b09be768f444462e0b438b5c3800defbdffe8f0e221f6f09be59d8d2535 Mar 21 05:48:29 crc kubenswrapper[4775]: I0321 05:48:29.317661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-895n9" event={"ID":"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f","Type":"ContainerStarted","Data":"f3085622e93162df966234d5c6f8955f876617fcb1bf4d233a2caaceadb9d60f"} Mar 21 05:48:29 crc kubenswrapper[4775]: I0321 05:48:29.317763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-895n9" event={"ID":"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f","Type":"ContainerStarted","Data":"28871b09be768f444462e0b438b5c3800defbdffe8f0e221f6f09be59d8d2535"} Mar 21 05:48:29 crc kubenswrapper[4775]: I0321 05:48:29.336072 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8t2r/crc-debug-895n9" podStartSLOduration=1.336038829 podStartE2EDuration="1.336038829s" 
podCreationTimestamp="2026-03-21 05:48:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:48:29.331765388 +0000 UTC m=+3662.308229032" watchObservedRunningTime="2026-03-21 05:48:29.336038829 +0000 UTC m=+3662.312502493" Mar 21 05:48:30 crc kubenswrapper[4775]: I0321 05:48:30.327977 4775 generic.go:334] "Generic (PLEG): container finished" podID="4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" containerID="f3085622e93162df966234d5c6f8955f876617fcb1bf4d233a2caaceadb9d60f" exitCode=0 Mar 21 05:48:30 crc kubenswrapper[4775]: I0321 05:48:30.328051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-895n9" event={"ID":"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f","Type":"ContainerDied","Data":"f3085622e93162df966234d5c6f8955f876617fcb1bf4d233a2caaceadb9d60f"} Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.440649 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.470579 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-895n9"] Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.478714 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-895n9"] Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.585571 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxz5n\" (UniqueName: \"kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n\") pod \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.585743 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host\") pod \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\" (UID: \"4fb0eb42-ffda-49af-a49c-d9ecdb7af97f\") " Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.585894 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host" (OuterVolumeSpecName: "host") pod "4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" (UID: "4fb0eb42-ffda-49af-a49c-d9ecdb7af97f"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.586438 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.594481 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n" (OuterVolumeSpecName: "kube-api-access-sxz5n") pod "4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" (UID: "4fb0eb42-ffda-49af-a49c-d9ecdb7af97f"). InnerVolumeSpecName "kube-api-access-sxz5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.675024 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" path="/var/lib/kubelet/pods/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f/volumes" Mar 21 05:48:31 crc kubenswrapper[4775]: I0321 05:48:31.688294 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxz5n\" (UniqueName: \"kubernetes.io/projected/4fb0eb42-ffda-49af-a49c-d9ecdb7af97f-kube-api-access-sxz5n\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.347700 4775 scope.go:117] "RemoveContainer" containerID="f3085622e93162df966234d5c6f8955f876617fcb1bf4d233a2caaceadb9d60f" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.347755 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-895n9" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.481952 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.482667 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.642063 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-k6ct4"] Mar 21 05:48:32 crc kubenswrapper[4775]: E0321 05:48:32.642811 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" containerName="container-00" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.642829 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" containerName="container-00" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.643014 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fb0eb42-ffda-49af-a49c-d9ecdb7af97f" containerName="container-00" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.643728 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.810506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.811031 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9mn2\" (UniqueName: \"kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.913403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.913451 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9mn2\" (UniqueName: \"kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.913613 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc 
kubenswrapper[4775]: I0321 05:48:32.934035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9mn2\" (UniqueName: \"kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2\") pod \"crc-debug-k6ct4\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: I0321 05:48:32.961342 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:32 crc kubenswrapper[4775]: W0321 05:48:32.997022 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04654f3_0f1c_4883_8385_4149f0076d44.slice/crio-5c91c61fdedd7ac3b55f157313b39d5da2758e7741b49a653fbe9b5afd78473b WatchSource:0}: Error finding container 5c91c61fdedd7ac3b55f157313b39d5da2758e7741b49a653fbe9b5afd78473b: Status 404 returned error can't find the container with id 5c91c61fdedd7ac3b55f157313b39d5da2758e7741b49a653fbe9b5afd78473b Mar 21 05:48:33 crc kubenswrapper[4775]: I0321 05:48:33.360078 4775 generic.go:334] "Generic (PLEG): container finished" podID="c04654f3-0f1c-4883-8385-4149f0076d44" containerID="7531509e6b8b22386d2d189b980c4f48fe41789688a7cacac28bf362ed5c6b17" exitCode=0 Mar 21 05:48:33 crc kubenswrapper[4775]: I0321 05:48:33.360167 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" event={"ID":"c04654f3-0f1c-4883-8385-4149f0076d44","Type":"ContainerDied","Data":"7531509e6b8b22386d2d189b980c4f48fe41789688a7cacac28bf362ed5c6b17"} Mar 21 05:48:33 crc kubenswrapper[4775]: I0321 05:48:33.360347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" event={"ID":"c04654f3-0f1c-4883-8385-4149f0076d44","Type":"ContainerStarted","Data":"5c91c61fdedd7ac3b55f157313b39d5da2758e7741b49a653fbe9b5afd78473b"} Mar 21 
05:48:33 crc kubenswrapper[4775]: I0321 05:48:33.403088 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-k6ct4"] Mar 21 05:48:33 crc kubenswrapper[4775]: I0321 05:48:33.413500 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8t2r/crc-debug-k6ct4"] Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.474095 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.647272 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9mn2\" (UniqueName: \"kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2\") pod \"c04654f3-0f1c-4883-8385-4149f0076d44\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.647422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host\") pod \"c04654f3-0f1c-4883-8385-4149f0076d44\" (UID: \"c04654f3-0f1c-4883-8385-4149f0076d44\") " Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.647853 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host" (OuterVolumeSpecName: "host") pod "c04654f3-0f1c-4883-8385-4149f0076d44" (UID: "c04654f3-0f1c-4883-8385-4149f0076d44"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.648177 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c04654f3-0f1c-4883-8385-4149f0076d44-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.653329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2" (OuterVolumeSpecName: "kube-api-access-m9mn2") pod "c04654f3-0f1c-4883-8385-4149f0076d44" (UID: "c04654f3-0f1c-4883-8385-4149f0076d44"). InnerVolumeSpecName "kube-api-access-m9mn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:48:34 crc kubenswrapper[4775]: I0321 05:48:34.750411 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9mn2\" (UniqueName: \"kubernetes.io/projected/c04654f3-0f1c-4883-8385-4149f0076d44-kube-api-access-m9mn2\") on node \"crc\" DevicePath \"\"" Mar 21 05:48:35 crc kubenswrapper[4775]: I0321 05:48:35.897379 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04654f3-0f1c-4883-8385-4149f0076d44" path="/var/lib/kubelet/pods/c04654f3-0f1c-4883-8385-4149f0076d44/volumes" Mar 21 05:48:35 crc kubenswrapper[4775]: I0321 05:48:35.918609 4775 scope.go:117] "RemoveContainer" containerID="7531509e6b8b22386d2d189b980c4f48fe41789688a7cacac28bf362ed5c6b17" Mar 21 05:48:35 crc kubenswrapper[4775]: I0321 05:48:35.918955 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/crc-debug-k6ct4" Mar 21 05:48:48 crc kubenswrapper[4775]: I0321 05:48:48.957581 4775 scope.go:117] "RemoveContainer" containerID="5c92c80d8920da595f41beeabde7d0200200f2dfb0bec0c216c4b59513f9ce40" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.437185 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77fd86567d-mf2wb_1727040d-36f5-431c-b8f1-84e206146dcf/barbican-api/0.log" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.572397 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77fd86567d-mf2wb_1727040d-36f5-431c-b8f1-84e206146dcf/barbican-api-log/0.log" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.625859 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f5b547c8-mgjw5_177228f6-7f69-49c2-9942-ea0a98b56d13/barbican-keystone-listener/0.log" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.701598 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f5b547c8-mgjw5_177228f6-7f69-49c2-9942-ea0a98b56d13/barbican-keystone-listener-log/0.log" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.916190 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6dd9f89d55-fdf8c_d8a7c2e5-3643-4675-9888-3c310e4f9ad4/barbican-worker/0.log" Mar 21 05:48:50 crc kubenswrapper[4775]: I0321 05:48:50.924071 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6dd9f89d55-fdf8c_d8a7c2e5-3643-4675-9888-3c310e4f9ad4/barbican-worker-log/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.242214 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m_203df932-0574-4098-b897-ba50813f2ec1/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: 
I0321 05:48:51.354015 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/ceilometer-central-agent/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.392935 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/ceilometer-notification-agent/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.449906 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/proxy-httpd/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.485141 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/sg-core/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.632277 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c16d835a-1ec2-473d-b2d8-c8e7c978e140/cinder-api/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.689903 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c16d835a-1ec2-473d-b2d8-c8e7c978e140/cinder-api-log/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.815198 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d02bb319-292e-449c-8e5b-42c6859f1529/cinder-scheduler/0.log" Mar 21 05:48:51 crc kubenswrapper[4775]: I0321 05:48:51.839665 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d02bb319-292e-449c-8e5b-42c6859f1529/probe/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.007295 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7_5efe4255-484c-47d7-800a-4d0dbc5cecd9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 
05:48:52.196594 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn_8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.258038 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/init/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.493684 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/init/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.558744 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/dnsmasq-dns/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.564350 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-h87nr_28040d61-c9ea-4a55-b113-db871dff679c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.756675 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_01638b90-5e17-43b3-a3b5-90726b26e243/glance-httpd/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.795261 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_01638b90-5e17-43b3-a3b5-90726b26e243/glance-log/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.922510 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9e72c4b-a3fb-41eb-974a-74d24d6cdac9/glance-httpd/0.log" Mar 21 05:48:52 crc kubenswrapper[4775]: I0321 05:48:52.956443 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_e9e72c4b-a3fb-41eb-974a-74d24d6cdac9/glance-log/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.184975 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6496ddbdd4-v5mc5_fc6e433f-9e70-4b09-9780-403634bbe0dc/horizon/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.353740 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f_8390751b-3911-4a24-a1a2-c3d1d10da875/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.494854 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6496ddbdd4-v5mc5_fc6e433f-9e70-4b09-9780-403634bbe0dc/horizon-log/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.735696 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kfrz2_dbd64e65-be8d-42e7-a686-d5454932156d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.881319 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7ee3b7a0-9eb3-4702-8fb7-3286df60b21b/kube-state-metrics/0.log" Mar 21 05:48:53 crc kubenswrapper[4775]: I0321 05:48:53.913430 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-66c879cfdd-smnxp_9f965feb-5d82-4176-a14d-08a84c4ae794/keystone-api/0.log" Mar 21 05:48:54 crc kubenswrapper[4775]: I0321 05:48:54.525858 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d65998c7c-prp5b_3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef/neutron-api/0.log" Mar 21 05:48:54 crc kubenswrapper[4775]: I0321 05:48:54.647762 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d65998c7c-prp5b_3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef/neutron-httpd/0.log" Mar 
21 05:48:54 crc kubenswrapper[4775]: I0321 05:48:54.734058 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz_a71dcc90-c70a-4ff8-bf4a-42f1a2415827/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.012748 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9_29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.480501 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_789a0bb8-b131-4144-9400-7c32a604d6d5/nova-cell0-conductor-conductor/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.542453 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_886c404c-ceec-48e7-90da-96d6aa201152/nova-api-log/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.762563 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_886c404c-ceec-48e7-90da-96d6aa201152/nova-api-api/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.817790 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b29323a7-476f-4a13-8085-4b2158a68850/nova-cell1-conductor-conductor/0.log" Mar 21 05:48:55 crc kubenswrapper[4775]: I0321 05:48:55.902220 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3b7ea443-e30d-41d1-9f42-0bef9d7bd012/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:48:56 crc kubenswrapper[4775]: I0321 05:48:56.369532 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_208cfa71-8242-4958-b9db-21fc180a6697/nova-metadata-log/0.log" Mar 21 05:48:56 crc kubenswrapper[4775]: I0321 05:48:56.766795 4775 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-t85b7_50003f97-774c-4321-9ddf-6ac67546b19f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:56 crc kubenswrapper[4775]: I0321 05:48:56.777469 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_208cfa71-8242-4958-b9db-21fc180a6697/nova-metadata-metadata/0.log" Mar 21 05:48:56 crc kubenswrapper[4775]: I0321 05:48:56.801159 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bab55731-40da-4831-a8b5-f9c413452367/nova-scheduler-scheduler/0.log" Mar 21 05:48:56 crc kubenswrapper[4775]: I0321 05:48:56.918560 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/mysql-bootstrap/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.111405 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/mysql-bootstrap/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.212828 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/mysql-bootstrap/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.231599 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/galera/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.434940 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/mysql-bootstrap/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.441563 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_bb5a0456-b5c5-433a-afde-fe38740e2310/openstackclient/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.446935 4775 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/galera/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.695550 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ckh8b_9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8/openstack-network-exporter/0.log" Mar 21 05:48:57 crc kubenswrapper[4775]: I0321 05:48:57.788934 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nmtjx_8a8e948c-2978-40c8-961b-1b010f7ea920/ovn-controller/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.015640 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server-init/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.233301 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server-init/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.243792 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovs-vswitchd/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.247404 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.508381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_334a5c95-becc-4389-bb6f-50e5957cded6/openstack-network-exporter/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.530942 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-45kkj_3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: 
I0321 05:48:58.557522 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_334a5c95-becc-4389-bb6f-50e5957cded6/ovn-northd/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.768994 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e/openstack-network-exporter/0.log" Mar 21 05:48:58 crc kubenswrapper[4775]: I0321 05:48:58.839207 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e/ovsdbserver-nb/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.172478 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93886182-fca2-42a9-a134-2243c7c7073d/openstack-network-exporter/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.207587 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93886182-fca2-42a9-a134-2243c7c7073d/ovsdbserver-sb/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.383910 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58948d8bb4-rcw89_279dff90-9d39-418a-b5e7-00333a376d16/placement-api/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.478964 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58948d8bb4-rcw89_279dff90-9d39-418a-b5e7-00333a376d16/placement-log/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.564741 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/setup-container/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.731046 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/setup-container/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.800011 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/rabbitmq/0.log" Mar 21 05:48:59 crc kubenswrapper[4775]: I0321 05:48:59.874012 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/setup-container/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.068847 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/setup-container/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.100806 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/rabbitmq/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.124039 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6_034f630d-d6d6-41f0-8df6-e5db37b778f3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.320314 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qgpzc_7c88e417-9ede-41d8-8337-79620ceb7798/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.405993 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m_f41367b2-433d-48f7-af75-575be4b318fc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.617921 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wtk52_8cf3d7cc-425b-4d40-a26c-a88d2b210c0a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.635504 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swg9k_5f45d376-4f59-4584-a545-16d4ff066232/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.903290 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f9b88fb79-vclnv_77432545-f22c-453a-b6a7-7c932712efa9/proxy-server/0.log" Mar 21 05:49:00 crc kubenswrapper[4775]: I0321 05:49:00.947066 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f9b88fb79-vclnv_77432545-f22c-453a-b6a7-7c932712efa9/proxy-httpd/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.030619 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kzll6_9e521c27-9d67-47bc-b6ac-74fabb543d3f/swift-ring-rebalance/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.205157 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-auditor/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.251183 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-reaper/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.287061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-replicator/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.460562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-replicator/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.487453 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-auditor/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.499398 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-server/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.505157 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-server/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.717765 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-expirer/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.727712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-updater/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.734905 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-auditor/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.766479 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-replicator/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.945833 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-server/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.949048 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-updater/0.log" Mar 21 05:49:01 crc kubenswrapper[4775]: I0321 05:49:01.969013 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/rsync/0.log" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.005800 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/swift-recon-cron/0.log" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.257864 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1c832898-838d-423d-8ad8-512c5ee5706c/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.465605 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2c99faa4-db71-4a05-a018-9c382f33f55e/test-operator-logs-container/0.log" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.481738 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.481805 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.481858 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.483247 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.483309 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" gracePeriod=600 Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.626192 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg_bccbefa9-966d-44b9-bd8f-bb566649b315/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:02 crc kubenswrapper[4775]: E0321 05:49:02.679702 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:02 crc kubenswrapper[4775]: I0321 05:49:02.755676 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mfft7_268b27f0-a217-459e-9502-7b522ca6fe2c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:49:03 crc kubenswrapper[4775]: I0321 05:49:03.171188 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" exitCode=0 Mar 21 05:49:03 crc kubenswrapper[4775]: I0321 05:49:03.171224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"} Mar 21 05:49:03 crc kubenswrapper[4775]: I0321 05:49:03.171273 4775 scope.go:117] "RemoveContainer" containerID="27384562242ad4f001715c32b1c4d680359f25816c963e4f026f07e0e28e85a2" Mar 21 05:49:03 crc kubenswrapper[4775]: I0321 05:49:03.172107 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:49:03 crc kubenswrapper[4775]: E0321 05:49:03.172505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:11 crc kubenswrapper[4775]: I0321 05:49:11.743425 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2a8305a2-5178-437f-a896-314b34fa595e/memcached/0.log" Mar 21 05:49:14 crc kubenswrapper[4775]: I0321 05:49:14.661718 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:49:14 crc kubenswrapper[4775]: E0321 05:49:14.662554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:29 crc kubenswrapper[4775]: I0321 05:49:29.661294 4775 scope.go:117] "RemoveContainer" 
containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:49:29 crc kubenswrapper[4775]: E0321 05:49:29.661887 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.093606 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.226902 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.257350 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.305941 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.511747 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/extract/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.530268 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.537819 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log" Mar 21 05:49:32 crc kubenswrapper[4775]: I0321 05:49:32.766456 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-twhxx_94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.085081 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-lxvtw_87dcea67-7f65-46a6-996b-3985bf1b5171/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.282009 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-lzvsq_80932361-6406-48dd-9e4b-4e9c27813f68/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.458669 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-m4gqz_0e83601c-758c-4f12-b745-bb68b0c4904f/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.635609 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-7bdbc_01ed348d-8a8a-4717-ba0d-1944b3f1c081/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.681738 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dn22m_9fe71acc-7d35-4d4b-ac69-e193d3f39028/manager/0.log" Mar 21 05:49:33 crc kubenswrapper[4775]: I0321 05:49:33.957069 4775 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-65f65cc49c-2mgp8_0a66456f-7860-4dc1-9c1c-0db69ddcc800/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.123839 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-9g8pv_f00c8c4b-874f-45ec-8a1a-e0834b3fc252/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.284335 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-jq88h_20c78c73-daf5-481e-a4ac-62de73b5969e/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.331290 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-c7pjw_49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.571959 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-hf688_ec38d53e-6fe4-41b5-8548-e49fadd9d6bf/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.581914 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-4tbvg_9bfe7d25-53ea-484d-a481-0ea04ee2b8a8/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.803016 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-67scw_745c79b1-1bcf-4c0f-82ee-a26cbba46d48/manager/0.log" Mar 21 05:49:34 crc kubenswrapper[4775]: I0321 05:49:34.921295 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-tss7r_02b6af47-2c06-480b-a838-2d742efa1045/manager/0.log" Mar 21 05:49:35 crc kubenswrapper[4775]: I0321 
05:49:35.033474 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-rwq59_3035739a-202f-4794-bb4f-ae2342a96441/manager/0.log" Mar 21 05:49:35 crc kubenswrapper[4775]: I0321 05:49:35.137515 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-85fcfb8fbb-q2k4k_899f7d20-7208-419e-b0f8-36c7fbf2e841/operator/0.log" Mar 21 05:49:35 crc kubenswrapper[4775]: I0321 05:49:35.384014 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvtkg_8c7426e8-8cec-4c84-8810-03a091d87cd9/registry-server/0.log" Mar 21 05:49:35 crc kubenswrapper[4775]: I0321 05:49:35.604953 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-ctm9h_6eeb04ad-7251-488c-bd52-b2f14f6fb68b/manager/0.log" Mar 21 05:49:35 crc kubenswrapper[4775]: I0321 05:49:35.861284 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-jj4pt_898b32c5-9f21-4fba-90c5-a333f36addf2/manager/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: I0321 05:49:36.078596 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jvglf_95a44b12-e027-400d-b257-99f2012251d8/operator/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: I0321 05:49:36.251958 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-lqmgv_5968f1d9-f4e0-4c67-923e-2494e15c4088/manager/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: I0321 05:49:36.414230 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-9s82h_d1dbd80a-0782-4035-a263-b52a90f6ee0e/manager/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: 
I0321 05:49:36.518541 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-l59gx_9cac78ed-6325-4649-bb05-a1518ae692e9/manager/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: I0321 05:49:36.544935 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65746ff4dc-hg4rq_907f0cdf-2d87-4d09-97af-5591d061b4f6/manager/0.log" Mar 21 05:49:36 crc kubenswrapper[4775]: I0321 05:49:36.665471 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-gdrc5_3c9f18bd-def6-45ff-a92b-25c6f40d6bb5/manager/0.log" Mar 21 05:49:40 crc kubenswrapper[4775]: I0321 05:49:40.661604 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:49:40 crc kubenswrapper[4775]: E0321 05:49:40.662245 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:55 crc kubenswrapper[4775]: I0321 05:49:55.662397 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:49:55 crc kubenswrapper[4775]: E0321 05:49:55.663253 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:49:57 crc kubenswrapper[4775]: I0321 05:49:57.226140 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d7xpn_e5224539-6d29-4bc3-9656-4665eb287e28/control-plane-machine-set-operator/0.log" Mar 21 05:49:57 crc kubenswrapper[4775]: I0321 05:49:57.429022 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5rnj6_fe3df1e1-4c22-48df-aaea-469c864f0310/kube-rbac-proxy/0.log" Mar 21 05:49:57 crc kubenswrapper[4775]: I0321 05:49:57.468523 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5rnj6_fe3df1e1-4c22-48df-aaea-469c864f0310/machine-api-operator/0.log" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.144015 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567870-hdgmt"] Mar 21 05:50:00 crc kubenswrapper[4775]: E0321 05:50:00.144980 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04654f3-0f1c-4883-8385-4149f0076d44" containerName="container-00" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.144999 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04654f3-0f1c-4883-8385-4149f0076d44" containerName="container-00" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.145298 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04654f3-0f1c-4883-8385-4149f0076d44" containerName="container-00" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.146174 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.148422 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.148498 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.149483 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.152880 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-hdgmt"] Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.295953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kckln\" (UniqueName: \"kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln\") pod \"auto-csr-approver-29567870-hdgmt\" (UID: \"3b54d80b-a103-474c-8214-44ec9b595a43\") " pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.398349 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kckln\" (UniqueName: \"kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln\") pod \"auto-csr-approver-29567870-hdgmt\" (UID: \"3b54d80b-a103-474c-8214-44ec9b595a43\") " pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.427315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kckln\" (UniqueName: \"kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln\") pod \"auto-csr-approver-29567870-hdgmt\" (UID: \"3b54d80b-a103-474c-8214-44ec9b595a43\") " 
pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:00 crc kubenswrapper[4775]: I0321 05:50:00.517798 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:01 crc kubenswrapper[4775]: I0321 05:50:01.002337 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-hdgmt"] Mar 21 05:50:01 crc kubenswrapper[4775]: I0321 05:50:01.005431 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:50:01 crc kubenswrapper[4775]: I0321 05:50:01.632011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" event={"ID":"3b54d80b-a103-474c-8214-44ec9b595a43","Type":"ContainerStarted","Data":"cda58da76eb41a6cdea286216edcfc30a8f820e91ef784de81a6113e9dc45d9b"} Mar 21 05:50:02 crc kubenswrapper[4775]: I0321 05:50:02.641503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" event={"ID":"3b54d80b-a103-474c-8214-44ec9b595a43","Type":"ContainerStarted","Data":"21e7854b61e199821209ba3f9487dda8cf82083673134efff058e8fd96d17f3f"} Mar 21 05:50:03 crc kubenswrapper[4775]: I0321 05:50:03.653153 4775 generic.go:334] "Generic (PLEG): container finished" podID="3b54d80b-a103-474c-8214-44ec9b595a43" containerID="21e7854b61e199821209ba3f9487dda8cf82083673134efff058e8fd96d17f3f" exitCode=0 Mar 21 05:50:03 crc kubenswrapper[4775]: I0321 05:50:03.653231 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" event={"ID":"3b54d80b-a103-474c-8214-44ec9b595a43","Type":"ContainerDied","Data":"21e7854b61e199821209ba3f9487dda8cf82083673134efff058e8fd96d17f3f"} Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.075836 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.216928 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kckln\" (UniqueName: \"kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln\") pod \"3b54d80b-a103-474c-8214-44ec9b595a43\" (UID: \"3b54d80b-a103-474c-8214-44ec9b595a43\") " Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.222788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln" (OuterVolumeSpecName: "kube-api-access-kckln") pod "3b54d80b-a103-474c-8214-44ec9b595a43" (UID: "3b54d80b-a103-474c-8214-44ec9b595a43"). InnerVolumeSpecName "kube-api-access-kckln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.319836 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kckln\" (UniqueName: \"kubernetes.io/projected/3b54d80b-a103-474c-8214-44ec9b595a43-kube-api-access-kckln\") on node \"crc\" DevicePath \"\"" Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.673022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" event={"ID":"3b54d80b-a103-474c-8214-44ec9b595a43","Type":"ContainerDied","Data":"cda58da76eb41a6cdea286216edcfc30a8f820e91ef784de81a6113e9dc45d9b"} Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.673073 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda58da76eb41a6cdea286216edcfc30a8f820e91ef784de81a6113e9dc45d9b" Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.673669 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567870-hdgmt" Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.734590 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-zsztj"] Mar 21 05:50:05 crc kubenswrapper[4775]: I0321 05:50:05.745087 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567864-zsztj"] Mar 21 05:50:07 crc kubenswrapper[4775]: I0321 05:50:07.671533 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3565ae50-5b89-4775-a8c2-45209b86619f" path="/var/lib/kubelet/pods/3565ae50-5b89-4775-a8c2-45209b86619f/volumes" Mar 21 05:50:09 crc kubenswrapper[4775]: I0321 05:50:09.662028 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:50:09 crc kubenswrapper[4775]: E0321 05:50:09.662795 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:50:10 crc kubenswrapper[4775]: I0321 05:50:10.469722 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c2lsw_4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe/cert-manager-controller/0.log" Mar 21 05:50:10 crc kubenswrapper[4775]: I0321 05:50:10.684412 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ngckr_172b2006-3394-469a-be7f-1b66d020fd45/cert-manager-cainjector/0.log" Mar 21 05:50:10 crc kubenswrapper[4775]: I0321 05:50:10.736971 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ms9hv_670f734f-e215-441e-9b56-7251bc7f2484/cert-manager-webhook/0.log" Mar 21 05:50:22 crc kubenswrapper[4775]: I0321 05:50:22.661485 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:50:22 crc kubenswrapper[4775]: E0321 05:50:22.662458 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:50:23 crc kubenswrapper[4775]: I0321 05:50:23.735456 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-8x7x2_b0fbab95-1c88-40a3-8ccb-58bca74c8f3c/nmstate-console-plugin/0.log" Mar 21 05:50:23 crc kubenswrapper[4775]: I0321 05:50:23.924302 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6l74f_d03e5939-1625-4597-ad3b-9edf8e8075f5/nmstate-handler/0.log" Mar 21 05:50:23 crc kubenswrapper[4775]: I0321 05:50:23.990907 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-626kh_409422f2-717f-4f82-8dae-fe01dfda7083/kube-rbac-proxy/0.log" Mar 21 05:50:24 crc kubenswrapper[4775]: I0321 05:50:24.059703 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-626kh_409422f2-717f-4f82-8dae-fe01dfda7083/nmstate-metrics/0.log" Mar 21 05:50:24 crc kubenswrapper[4775]: I0321 05:50:24.187061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-xljlp_f4a0a79a-5b67-44f8-9ef5-304530d5e764/nmstate-operator/0.log" 
Mar 21 05:50:24 crc kubenswrapper[4775]: I0321 05:50:24.259318 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rptn7_944b76e5-c8c5-4cba-9df2-9e9b87b540a8/nmstate-webhook/0.log" Mar 21 05:50:33 crc kubenswrapper[4775]: I0321 05:50:33.662016 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:50:33 crc kubenswrapper[4775]: E0321 05:50:33.662809 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:50:48 crc kubenswrapper[4775]: I0321 05:50:48.661578 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:50:48 crc kubenswrapper[4775]: E0321 05:50:48.662463 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:50:49 crc kubenswrapper[4775]: I0321 05:50:49.252413 4775 scope.go:117] "RemoveContainer" containerID="dbc52e5cee44a0750d2409abd45a8584ff150c5de4624d45b211f61e09e87efd" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.327530 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qq976_ac0be1f3-95f0-40a4-9a94-c74cdaad9590/kube-rbac-proxy/0.log" Mar 21 
05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.444085 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qq976_ac0be1f3-95f0-40a4-9a94-c74cdaad9590/controller/0.log" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.559571 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.786712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.794817 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.823083 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 05:50:50 crc kubenswrapper[4775]: I0321 05:50:50.830707 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.022324 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.077079 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.100477 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.102315 4775 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.368788 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.389628 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.400762 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.404569 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/controller/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.636960 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/frr-metrics/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.651689 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/kube-rbac-proxy-frr/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.692415 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/kube-rbac-proxy/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.871223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/reloader/0.log" Mar 21 05:50:51 crc kubenswrapper[4775]: I0321 05:50:51.940896 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-7jsnx_e5808d9b-074a-4948-8283-fdfea77c63bc/frr-k8s-webhook-server/0.log" Mar 21 05:50:52 crc kubenswrapper[4775]: I0321 05:50:52.222231 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6695f56dbb-f6tqn_38d80f78-fa33-49b7-99c2-62d50d1c011b/manager/0.log" Mar 21 05:50:52 crc kubenswrapper[4775]: I0321 05:50:52.572916 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-559bfcf5c-qqsvn_8c52832d-1aee-4eac-b625-24110b985402/webhook-server/0.log" Mar 21 05:50:52 crc kubenswrapper[4775]: I0321 05:50:52.744513 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cpw6m_1d9349cc-e186-40c5-bb71-c176ff4f0a0d/kube-rbac-proxy/0.log" Mar 21 05:50:52 crc kubenswrapper[4775]: I0321 05:50:52.857453 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/frr/0.log" Mar 21 05:50:53 crc kubenswrapper[4775]: I0321 05:50:53.206712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cpw6m_1d9349cc-e186-40c5-bb71-c176ff4f0a0d/speaker/0.log" Mar 21 05:51:03 crc kubenswrapper[4775]: I0321 05:51:03.661502 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:51:03 crc kubenswrapper[4775]: E0321 05:51:03.662339 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:51:06 crc kubenswrapper[4775]: I0321 05:51:06.789886 
4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 05:51:06 crc kubenswrapper[4775]: I0321 05:51:06.998202 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.018981 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.024482 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.175411 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.208105 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/extract/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.246037 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.344142 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.517983 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.533065 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.538750 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.729237 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.746748 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.765424 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/extract/0.log" Mar 21 05:51:07 crc kubenswrapper[4775]: I0321 05:51:07.919085 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 05:51:08 crc 
kubenswrapper[4775]: I0321 05:51:08.100741 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.121669 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.123020 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.334055 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.352129 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.566760 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.841401 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.883077 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 05:51:08 crc kubenswrapper[4775]: I0321 05:51:08.901084 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.007097 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/registry-server/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.084548 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.087952 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.442408 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7fcx_59fec450-4b61-4a15-b1b5-b47dedd649a0/marketplace-operator/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.527579 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.746811 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.770313 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.792010 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 05:51:09 crc kubenswrapper[4775]: I0321 05:51:09.798625 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/registry-server/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.017153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.019553 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.065087 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/registry-server/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.203699 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.388421 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.401580 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.411495 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" 
Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.597776 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" Mar 21 05:51:10 crc kubenswrapper[4775]: I0321 05:51:10.629830 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 05:51:11 crc kubenswrapper[4775]: I0321 05:51:11.733534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/registry-server/0.log" Mar 21 05:51:15 crc kubenswrapper[4775]: I0321 05:51:15.661768 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:51:15 crc kubenswrapper[4775]: E0321 05:51:15.662439 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.443844 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:51:20 crc kubenswrapper[4775]: E0321 05:51:20.444924 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b54d80b-a103-474c-8214-44ec9b595a43" containerName="oc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.444937 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b54d80b-a103-474c-8214-44ec9b595a43" containerName="oc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.445162 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3b54d80b-a103-474c-8214-44ec9b595a43" containerName="oc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.446538 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.456665 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.517220 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.517290 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqhpz\" (UniqueName: \"kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.517449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.619766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content\") pod \"redhat-operators-snsfc\" (UID: 
\"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.619930 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.619992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqhpz\" (UniqueName: \"kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.620363 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.620680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.641207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqhpz\" (UniqueName: \"kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz\") pod \"redhat-operators-snsfc\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " 
pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:20 crc kubenswrapper[4775]: I0321 05:51:20.766382 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:21 crc kubenswrapper[4775]: I0321 05:51:21.314699 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:51:21 crc kubenswrapper[4775]: I0321 05:51:21.384047 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerStarted","Data":"60650042fc6959477421db51b293d07bc331592b53cc199339128f46e38e52de"} Mar 21 05:51:22 crc kubenswrapper[4775]: I0321 05:51:22.393570 4775 generic.go:334] "Generic (PLEG): container finished" podID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerID="a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6" exitCode=0 Mar 21 05:51:22 crc kubenswrapper[4775]: I0321 05:51:22.393781 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerDied","Data":"a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6"} Mar 21 05:51:23 crc kubenswrapper[4775]: I0321 05:51:23.404759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerStarted","Data":"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d"} Mar 21 05:51:29 crc kubenswrapper[4775]: I0321 05:51:29.500647 4775 generic.go:334] "Generic (PLEG): container finished" podID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerID="6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d" exitCode=0 Mar 21 05:51:29 crc kubenswrapper[4775]: I0321 05:51:29.500940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerDied","Data":"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d"} Mar 21 05:51:30 crc kubenswrapper[4775]: I0321 05:51:30.661318 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:51:30 crc kubenswrapper[4775]: E0321 05:51:30.661897 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:51:31 crc kubenswrapper[4775]: I0321 05:51:31.521448 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerStarted","Data":"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341"} Mar 21 05:51:40 crc kubenswrapper[4775]: I0321 05:51:40.768488 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:40 crc kubenswrapper[4775]: I0321 05:51:40.770130 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:51:41 crc kubenswrapper[4775]: I0321 05:51:41.661688 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:51:41 crc kubenswrapper[4775]: E0321 05:51:41.662226 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:51:41 crc kubenswrapper[4775]: I0321 05:51:41.831871 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" probeResult="failure" output=< Mar 21 05:51:41 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:51:41 crc kubenswrapper[4775]: > Mar 21 05:51:51 crc kubenswrapper[4775]: I0321 05:51:51.823317 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" probeResult="failure" output=< Mar 21 05:51:51 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:51:51 crc kubenswrapper[4775]: > Mar 21 05:51:53 crc kubenswrapper[4775]: I0321 05:51:53.661801 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:51:53 crc kubenswrapper[4775]: E0321 05:51:53.662535 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.150342 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snsfc" podStartSLOduration=32.012277472 
podStartE2EDuration="40.150318753s" podCreationTimestamp="2026-03-21 05:51:20 +0000 UTC" firstStartedPulling="2026-03-21 05:51:22.39574965 +0000 UTC m=+3835.372213274" lastFinishedPulling="2026-03-21 05:51:30.533790931 +0000 UTC m=+3843.510254555" observedRunningTime="2026-03-21 05:51:31.55570931 +0000 UTC m=+3844.532172944" watchObservedRunningTime="2026-03-21 05:52:00.150318753 +0000 UTC m=+3873.126782397" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.151836 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567872-ctnt2"] Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.153433 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.155850 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.156146 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.158829 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.163897 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-ctnt2"] Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.201883 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5npp\" (UniqueName: \"kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp\") pod \"auto-csr-approver-29567872-ctnt2\" (UID: \"b19b93e1-3fea-4999-8e31-ba5cc64c91ea\") " pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.303271 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5npp\" (UniqueName: \"kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp\") pod \"auto-csr-approver-29567872-ctnt2\" (UID: \"b19b93e1-3fea-4999-8e31-ba5cc64c91ea\") " pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.329329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5npp\" (UniqueName: \"kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp\") pod \"auto-csr-approver-29567872-ctnt2\" (UID: \"b19b93e1-3fea-4999-8e31-ba5cc64c91ea\") " pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.471331 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:00 crc kubenswrapper[4775]: I0321 05:52:00.933258 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-ctnt2"] Mar 21 05:52:01 crc kubenswrapper[4775]: I0321 05:52:01.812557 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" event={"ID":"b19b93e1-3fea-4999-8e31-ba5cc64c91ea","Type":"ContainerStarted","Data":"f7d5d467cc0169abf2339a995d4bae6f7dc8fca7c98d6a4d83edb19d0351f20b"} Mar 21 05:52:01 crc kubenswrapper[4775]: I0321 05:52:01.826469 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" probeResult="failure" output=< Mar 21 05:52:01 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:52:01 crc kubenswrapper[4775]: > Mar 21 05:52:03 crc kubenswrapper[4775]: I0321 05:52:03.839061 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="b19b93e1-3fea-4999-8e31-ba5cc64c91ea" containerID="25394576ff3d5198021128077858159917ead74d0d3b8d2389ae8fb099e54be8" exitCode=0 Mar 21 05:52:03 crc kubenswrapper[4775]: I0321 05:52:03.839439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" event={"ID":"b19b93e1-3fea-4999-8e31-ba5cc64c91ea","Type":"ContainerDied","Data":"25394576ff3d5198021128077858159917ead74d0d3b8d2389ae8fb099e54be8"} Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.233849 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.424921 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5npp\" (UniqueName: \"kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp\") pod \"b19b93e1-3fea-4999-8e31-ba5cc64c91ea\" (UID: \"b19b93e1-3fea-4999-8e31-ba5cc64c91ea\") " Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.431767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp" (OuterVolumeSpecName: "kube-api-access-v5npp") pod "b19b93e1-3fea-4999-8e31-ba5cc64c91ea" (UID: "b19b93e1-3fea-4999-8e31-ba5cc64c91ea"). InnerVolumeSpecName "kube-api-access-v5npp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.527871 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5npp\" (UniqueName: \"kubernetes.io/projected/b19b93e1-3fea-4999-8e31-ba5cc64c91ea-kube-api-access-v5npp\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.661751 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:52:05 crc kubenswrapper[4775]: E0321 05:52:05.662104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.865069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" event={"ID":"b19b93e1-3fea-4999-8e31-ba5cc64c91ea","Type":"ContainerDied","Data":"f7d5d467cc0169abf2339a995d4bae6f7dc8fca7c98d6a4d83edb19d0351f20b"} Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.865160 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7d5d467cc0169abf2339a995d4bae6f7dc8fca7c98d6a4d83edb19d0351f20b" Mar 21 05:52:05 crc kubenswrapper[4775]: I0321 05:52:05.865180 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567872-ctnt2" Mar 21 05:52:06 crc kubenswrapper[4775]: I0321 05:52:06.322450 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-r9qrw"] Mar 21 05:52:06 crc kubenswrapper[4775]: I0321 05:52:06.331086 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567866-r9qrw"] Mar 21 05:52:07 crc kubenswrapper[4775]: I0321 05:52:07.676983 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52968d61-f2a8-4bcd-9e97-cb01c508a52e" path="/var/lib/kubelet/pods/52968d61-f2a8-4bcd-9e97-cb01c508a52e/volumes" Mar 21 05:52:11 crc kubenswrapper[4775]: I0321 05:52:11.867427 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" probeResult="failure" output=< Mar 21 05:52:11 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:52:11 crc kubenswrapper[4775]: > Mar 21 05:52:18 crc kubenswrapper[4775]: I0321 05:52:18.661666 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:52:18 crc kubenswrapper[4775]: E0321 05:52:18.662392 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:52:21 crc kubenswrapper[4775]: I0321 05:52:21.838900 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" 
containerName="registry-server" probeResult="failure" output=< Mar 21 05:52:21 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:52:21 crc kubenswrapper[4775]: > Mar 21 05:52:31 crc kubenswrapper[4775]: I0321 05:52:31.836355 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" probeResult="failure" output=< Mar 21 05:52:31 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 05:52:31 crc kubenswrapper[4775]: > Mar 21 05:52:33 crc kubenswrapper[4775]: I0321 05:52:33.661330 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:52:33 crc kubenswrapper[4775]: E0321 05:52:33.662030 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:52:40 crc kubenswrapper[4775]: I0321 05:52:40.836305 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:52:40 crc kubenswrapper[4775]: I0321 05:52:40.896629 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:52:41 crc kubenswrapper[4775]: I0321 05:52:41.074939 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:52:42 crc kubenswrapper[4775]: I0321 05:52:42.196358 4775 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-snsfc" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server" containerID="cri-o://a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341" gracePeriod=2 Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.156334 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.208360 4775 generic.go:334] "Generic (PLEG): container finished" podID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerID="a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341" exitCode=0 Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.208406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerDied","Data":"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341"} Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.208435 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snsfc" event={"ID":"79dd9ac3-231b-4217-a717-a3d127b0c0ca","Type":"ContainerDied","Data":"60650042fc6959477421db51b293d07bc331592b53cc199339128f46e38e52de"} Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.208453 4775 scope.go:117] "RemoveContainer" containerID="a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.208604 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snsfc" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.234779 4775 scope.go:117] "RemoveContainer" containerID="6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.242443 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities\") pod \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.242629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqhpz\" (UniqueName: \"kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz\") pod \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.242745 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content\") pod \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\" (UID: \"79dd9ac3-231b-4217-a717-a3d127b0c0ca\") " Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.243457 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities" (OuterVolumeSpecName: "utilities") pod "79dd9ac3-231b-4217-a717-a3d127b0c0ca" (UID: "79dd9ac3-231b-4217-a717-a3d127b0c0ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.250797 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz" (OuterVolumeSpecName: "kube-api-access-jqhpz") pod "79dd9ac3-231b-4217-a717-a3d127b0c0ca" (UID: "79dd9ac3-231b-4217-a717-a3d127b0c0ca"). InnerVolumeSpecName "kube-api-access-jqhpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.259313 4775 scope.go:117] "RemoveContainer" containerID="a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.345835 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqhpz\" (UniqueName: \"kubernetes.io/projected/79dd9ac3-231b-4217-a717-a3d127b0c0ca-kube-api-access-jqhpz\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.345903 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.354475 4775 scope.go:117] "RemoveContainer" containerID="a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341" Mar 21 05:52:43 crc kubenswrapper[4775]: E0321 05:52:43.354901 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341\": container with ID starting with a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341 not found: ID does not exist" containerID="a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.354946 4775 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341"} err="failed to get container status \"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341\": rpc error: code = NotFound desc = could not find container \"a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341\": container with ID starting with a9c141457cd1a8a69f1b8a93fbbbea5921c7cfed496f1b33f8197ea1b1d27341 not found: ID does not exist" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.354975 4775 scope.go:117] "RemoveContainer" containerID="6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d" Mar 21 05:52:43 crc kubenswrapper[4775]: E0321 05:52:43.355282 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d\": container with ID starting with 6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d not found: ID does not exist" containerID="6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.355333 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d"} err="failed to get container status \"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d\": rpc error: code = NotFound desc = could not find container \"6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d\": container with ID starting with 6b761fc8b5585d54f920ab7f638b663e14b509ba0f6869c22d0a1c572cfc5a8d not found: ID does not exist" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.355367 4775 scope.go:117] "RemoveContainer" containerID="a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6" Mar 21 05:52:43 crc kubenswrapper[4775]: E0321 05:52:43.355684 4775 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6\": container with ID starting with a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6 not found: ID does not exist" containerID="a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.355716 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6"} err="failed to get container status \"a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6\": rpc error: code = NotFound desc = could not find container \"a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6\": container with ID starting with a695039e53bef0f3edbd44ff68e271754c9a8e6448d909abcaca31bcc1cb9ed6 not found: ID does not exist" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.378912 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79dd9ac3-231b-4217-a717-a3d127b0c0ca" (UID: "79dd9ac3-231b-4217-a717-a3d127b0c0ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.448862 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79dd9ac3-231b-4217-a717-a3d127b0c0ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.545534 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.554383 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snsfc"] Mar 21 05:52:43 crc kubenswrapper[4775]: I0321 05:52:43.674334 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" path="/var/lib/kubelet/pods/79dd9ac3-231b-4217-a717-a3d127b0c0ca/volumes" Mar 21 05:52:46 crc kubenswrapper[4775]: I0321 05:52:46.661717 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:52:46 crc kubenswrapper[4775]: E0321 05:52:46.662329 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:52:49 crc kubenswrapper[4775]: I0321 05:52:49.364570 4775 scope.go:117] "RemoveContainer" containerID="875a527a3f14334ab80b89530523916cfe6a1d4008ed65204aa4a0f63ca04790" Mar 21 05:53:00 crc kubenswrapper[4775]: I0321 05:53:00.662818 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:53:00 crc kubenswrapper[4775]: E0321 
05:53:00.663867 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:53:12 crc kubenswrapper[4775]: I0321 05:53:12.472370 4775 generic.go:334] "Generic (PLEG): container finished" podID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerID="891104e8ee294ddf739928b04e0a1b0e8f6bf51e7a90dfe50eb44a5a6ed19b18" exitCode=0 Mar 21 05:53:12 crc kubenswrapper[4775]: I0321 05:53:12.472451 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8t2r/must-gather-znpkx" event={"ID":"77cb0b91-097f-4e81-888d-57b2d37399bd","Type":"ContainerDied","Data":"891104e8ee294ddf739928b04e0a1b0e8f6bf51e7a90dfe50eb44a5a6ed19b18"} Mar 21 05:53:12 crc kubenswrapper[4775]: I0321 05:53:12.473563 4775 scope.go:117] "RemoveContainer" containerID="891104e8ee294ddf739928b04e0a1b0e8f6bf51e7a90dfe50eb44a5a6ed19b18" Mar 21 05:53:12 crc kubenswrapper[4775]: I0321 05:53:12.661923 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:53:12 crc kubenswrapper[4775]: E0321 05:53:12.662219 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 05:53:13 crc kubenswrapper[4775]: I0321 05:53:13.059081 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-b8t2r_must-gather-znpkx_77cb0b91-097f-4e81-888d-57b2d37399bd/gather/0.log" Mar 21 05:53:20 crc kubenswrapper[4775]: I0321 05:53:20.944329 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b8t2r/must-gather-znpkx"] Mar 21 05:53:20 crc kubenswrapper[4775]: I0321 05:53:20.945279 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b8t2r/must-gather-znpkx" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="copy" containerID="cri-o://76eca6bf5897fa3d0bd946a5ac8c7c93a3f123f745216d5491ccbb010515a8cc" gracePeriod=2 Mar 21 05:53:20 crc kubenswrapper[4775]: I0321 05:53:20.965528 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b8t2r/must-gather-znpkx"] Mar 21 05:53:21 crc kubenswrapper[4775]: I0321 05:53:21.559620 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8t2r_must-gather-znpkx_77cb0b91-097f-4e81-888d-57b2d37399bd/copy/0.log" Mar 21 05:53:21 crc kubenswrapper[4775]: I0321 05:53:21.560364 4775 generic.go:334] "Generic (PLEG): container finished" podID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerID="76eca6bf5897fa3d0bd946a5ac8c7c93a3f123f745216d5491ccbb010515a8cc" exitCode=143 Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.171361 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8t2r_must-gather-znpkx_77cb0b91-097f-4e81-888d-57b2d37399bd/copy/0.log" Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.172070 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8t2r/must-gather-znpkx"
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.277782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd8wn\" (UniqueName: \"kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn\") pod \"77cb0b91-097f-4e81-888d-57b2d37399bd\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") "
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.278141 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output\") pod \"77cb0b91-097f-4e81-888d-57b2d37399bd\" (UID: \"77cb0b91-097f-4e81-888d-57b2d37399bd\") "
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.294567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn" (OuterVolumeSpecName: "kube-api-access-cd8wn") pod "77cb0b91-097f-4e81-888d-57b2d37399bd" (UID: "77cb0b91-097f-4e81-888d-57b2d37399bd"). InnerVolumeSpecName "kube-api-access-cd8wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.381075 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd8wn\" (UniqueName: \"kubernetes.io/projected/77cb0b91-097f-4e81-888d-57b2d37399bd-kube-api-access-cd8wn\") on node \"crc\" DevicePath \"\""
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.446696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "77cb0b91-097f-4e81-888d-57b2d37399bd" (UID: "77cb0b91-097f-4e81-888d-57b2d37399bd"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.483595 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/77cb0b91-097f-4e81-888d-57b2d37399bd-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.574096 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b8t2r_must-gather-znpkx_77cb0b91-097f-4e81-888d-57b2d37399bd/copy/0.log"
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.574509 4775 scope.go:117] "RemoveContainer" containerID="76eca6bf5897fa3d0bd946a5ac8c7c93a3f123f745216d5491ccbb010515a8cc"
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.574665 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8t2r/must-gather-znpkx"
Mar 21 05:53:22 crc kubenswrapper[4775]: I0321 05:53:22.607629 4775 scope.go:117] "RemoveContainer" containerID="891104e8ee294ddf739928b04e0a1b0e8f6bf51e7a90dfe50eb44a5a6ed19b18"
Mar 21 05:53:23 crc kubenswrapper[4775]: I0321 05:53:23.677998 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" path="/var/lib/kubelet/pods/77cb0b91-097f-4e81-888d-57b2d37399bd/volumes"
Mar 21 05:53:25 crc kubenswrapper[4775]: I0321 05:53:25.662334 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"
Mar 21 05:53:25 crc kubenswrapper[4775]: E0321 05:53:25.662732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:53:39 crc kubenswrapper[4775]: I0321 05:53:39.660999 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"
Mar 21 05:53:39 crc kubenswrapper[4775]: E0321 05:53:39.661907 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:53:54 crc kubenswrapper[4775]: I0321 05:53:54.662409 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"
Mar 21 05:53:54 crc kubenswrapper[4775]: E0321 05:53:54.663305 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.147788 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567874-8f686"]
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150474 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19b93e1-3fea-4999-8e31-ba5cc64c91ea" containerName="oc"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150510 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19b93e1-3fea-4999-8e31-ba5cc64c91ea" containerName="oc"
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150529 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150537 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server"
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150559 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="extract-content"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150565 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="extract-content"
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150574 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="gather"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150582 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="gather"
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150597 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="copy"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150604 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="copy"
Mar 21 05:54:00 crc kubenswrapper[4775]: E0321 05:54:00.150626 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="extract-utilities"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150632 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="extract-utilities"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150834 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="79dd9ac3-231b-4217-a717-a3d127b0c0ca" containerName="registry-server"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150848 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="gather"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150866 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cb0b91-097f-4e81-888d-57b2d37399bd" containerName="copy"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.150876 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19b93e1-3fea-4999-8e31-ba5cc64c91ea" containerName="oc"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.151666 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.153481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.154559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.154587 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.161567 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-8f686"]
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.327437 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw286\" (UniqueName: \"kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286\") pod \"auto-csr-approver-29567874-8f686\" (UID: \"a6ebee46-23d7-4d38-946e-10f7a3238243\") " pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.429296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw286\" (UniqueName: \"kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286\") pod \"auto-csr-approver-29567874-8f686\" (UID: \"a6ebee46-23d7-4d38-946e-10f7a3238243\") " pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.451106 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw286\" (UniqueName: \"kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286\") pod \"auto-csr-approver-29567874-8f686\" (UID: \"a6ebee46-23d7-4d38-946e-10f7a3238243\") " pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:00 crc kubenswrapper[4775]: I0321 05:54:00.476402 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:01 crc kubenswrapper[4775]: I0321 05:54:01.091636 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-8f686"]
Mar 21 05:54:01 crc kubenswrapper[4775]: I0321 05:54:01.961744 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-8f686" event={"ID":"a6ebee46-23d7-4d38-946e-10f7a3238243","Type":"ContainerStarted","Data":"82a5c3a691a3d2dd7f12928070512565332eb06c5d5ceac43233d9692df468a7"}
Mar 21 05:54:02 crc kubenswrapper[4775]: E0321 05:54:02.869812 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ebee46_23d7_4d38_946e_10f7a3238243.slice/crio-conmon-b68cd9cb06e02040464cbd016f2e9b85be0a632889c514eaecf91f324598c72a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ebee46_23d7_4d38_946e_10f7a3238243.slice/crio-b68cd9cb06e02040464cbd016f2e9b85be0a632889c514eaecf91f324598c72a.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 05:54:02 crc kubenswrapper[4775]: I0321 05:54:02.972827 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6ebee46-23d7-4d38-946e-10f7a3238243" containerID="b68cd9cb06e02040464cbd016f2e9b85be0a632889c514eaecf91f324598c72a" exitCode=0
Mar 21 05:54:02 crc kubenswrapper[4775]: I0321 05:54:02.972879 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-8f686" event={"ID":"a6ebee46-23d7-4d38-946e-10f7a3238243","Type":"ContainerDied","Data":"b68cd9cb06e02040464cbd016f2e9b85be0a632889c514eaecf91f324598c72a"}
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.366817 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.521722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw286\" (UniqueName: \"kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286\") pod \"a6ebee46-23d7-4d38-946e-10f7a3238243\" (UID: \"a6ebee46-23d7-4d38-946e-10f7a3238243\") "
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.527979 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286" (OuterVolumeSpecName: "kube-api-access-rw286") pod "a6ebee46-23d7-4d38-946e-10f7a3238243" (UID: "a6ebee46-23d7-4d38-946e-10f7a3238243"). InnerVolumeSpecName "kube-api-access-rw286". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.624335 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw286\" (UniqueName: \"kubernetes.io/projected/a6ebee46-23d7-4d38-946e-10f7a3238243-kube-api-access-rw286\") on node \"crc\" DevicePath \"\""
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.989701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567874-8f686" event={"ID":"a6ebee46-23d7-4d38-946e-10f7a3238243","Type":"ContainerDied","Data":"82a5c3a691a3d2dd7f12928070512565332eb06c5d5ceac43233d9692df468a7"}
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.989739 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82a5c3a691a3d2dd7f12928070512565332eb06c5d5ceac43233d9692df468a7"
Mar 21 05:54:04 crc kubenswrapper[4775]: I0321 05:54:04.989820 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567874-8f686"
Mar 21 05:54:05 crc kubenswrapper[4775]: I0321 05:54:05.453762 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-8z86d"]
Mar 21 05:54:05 crc kubenswrapper[4775]: I0321 05:54:05.462244 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567868-8z86d"]
Mar 21 05:54:05 crc kubenswrapper[4775]: I0321 05:54:05.671677 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdeb3ead-3f1f-491f-9a15-f550065d18fb" path="/var/lib/kubelet/pods/bdeb3ead-3f1f-491f-9a15-f550065d18fb/volumes"
Mar 21 05:54:07 crc kubenswrapper[4775]: I0321 05:54:07.670779 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2"
Mar 21 05:54:08 crc kubenswrapper[4775]: I0321 05:54:08.021734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c"}
Mar 21 05:54:49 crc kubenswrapper[4775]: I0321 05:54:49.486974 4775 scope.go:117] "RemoveContainer" containerID="6995f01e339d899d20a0310c4264870e5b4687e152b20b7bc6f33c17d3fccaea"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.623755 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:45 crc kubenswrapper[4775]: E0321 05:55:45.624780 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ebee46-23d7-4d38-946e-10f7a3238243" containerName="oc"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.624795 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ebee46-23d7-4d38-946e-10f7a3238243" containerName="oc"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.624994 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ebee46-23d7-4d38-946e-10f7a3238243" containerName="oc"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.627469 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.643246 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.762677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.762742 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4wz\" (UniqueName: \"kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.762777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.864673 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.864744 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4wz\" (UniqueName: \"kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.864782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.865248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.865481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.897155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4wz\" (UniqueName: \"kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz\") pod \"certified-operators-xwtsn\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") " pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:45 crc kubenswrapper[4775]: I0321 05:55:45.958783 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:46 crc kubenswrapper[4775]: I0321 05:55:46.459806 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:46 crc kubenswrapper[4775]: I0321 05:55:46.971949 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerStarted","Data":"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"}
Mar 21 05:55:46 crc kubenswrapper[4775]: I0321 05:55:46.972286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerStarted","Data":"61e7a835618349eda7c91ee47d139dfad4d5f13ad9f24f1aa5d97237bd5616ae"}
Mar 21 05:55:47 crc kubenswrapper[4775]: I0321 05:55:47.980791 4775 generic.go:334] "Generic (PLEG): container finished" podID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerID="e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd" exitCode=0
Mar 21 05:55:47 crc kubenswrapper[4775]: I0321 05:55:47.981142 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerDied","Data":"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"}
Mar 21 05:55:47 crc kubenswrapper[4775]: I0321 05:55:47.984170 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:55:48 crc kubenswrapper[4775]: I0321 05:55:48.997797 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerStarted","Data":"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"}
Mar 21 05:55:50 crc kubenswrapper[4775]: I0321 05:55:50.007382 4775 generic.go:334] "Generic (PLEG): container finished" podID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerID="3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4" exitCode=0
Mar 21 05:55:50 crc kubenswrapper[4775]: I0321 05:55:50.007433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerDied","Data":"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"}
Mar 21 05:55:51 crc kubenswrapper[4775]: I0321 05:55:51.020846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerStarted","Data":"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"}
Mar 21 05:55:55 crc kubenswrapper[4775]: I0321 05:55:55.960019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:55 crc kubenswrapper[4775]: I0321 05:55:55.961617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:56 crc kubenswrapper[4775]: I0321 05:55:56.010324 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:56 crc kubenswrapper[4775]: I0321 05:55:56.027025 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xwtsn" podStartSLOduration=8.391415355 podStartE2EDuration="11.027005997s" podCreationTimestamp="2026-03-21 05:55:45 +0000 UTC" firstStartedPulling="2026-03-21 05:55:47.983893001 +0000 UTC m=+4100.960356625" lastFinishedPulling="2026-03-21 05:55:50.619483643 +0000 UTC m=+4103.595947267" observedRunningTime="2026-03-21 05:55:51.046813658 +0000 UTC m=+4104.023277322" watchObservedRunningTime="2026-03-21 05:55:56.027005997 +0000 UTC m=+4109.003469621"
Mar 21 05:55:56 crc kubenswrapper[4775]: I0321 05:55:56.108489 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:56 crc kubenswrapper[4775]: I0321 05:55:56.249818 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.080203 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xwtsn" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="registry-server" containerID="cri-o://cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d" gracePeriod=2
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.806298 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.923196 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities\") pod \"a2cc63cf-968e-4695-bb7a-bd8938511c29\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") "
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.923268 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content\") pod \"a2cc63cf-968e-4695-bb7a-bd8938511c29\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") "
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.923372 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn4wz\" (UniqueName: \"kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz\") pod \"a2cc63cf-968e-4695-bb7a-bd8938511c29\" (UID: \"a2cc63cf-968e-4695-bb7a-bd8938511c29\") "
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.924374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities" (OuterVolumeSpecName: "utilities") pod "a2cc63cf-968e-4695-bb7a-bd8938511c29" (UID: "a2cc63cf-968e-4695-bb7a-bd8938511c29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.929594 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz" (OuterVolumeSpecName: "kube-api-access-qn4wz") pod "a2cc63cf-968e-4695-bb7a-bd8938511c29" (UID: "a2cc63cf-968e-4695-bb7a-bd8938511c29"). InnerVolumeSpecName "kube-api-access-qn4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:55:58 crc kubenswrapper[4775]: I0321 05:55:58.991663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2cc63cf-968e-4695-bb7a-bd8938511c29" (UID: "a2cc63cf-968e-4695-bb7a-bd8938511c29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.025735 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.025790 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2cc63cf-968e-4695-bb7a-bd8938511c29-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.025808 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn4wz\" (UniqueName: \"kubernetes.io/projected/a2cc63cf-968e-4695-bb7a-bd8938511c29-kube-api-access-qn4wz\") on node \"crc\" DevicePath \"\""
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.091711 4775 generic.go:334] "Generic (PLEG): container finished" podID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerID="cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d" exitCode=0
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.091769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerDied","Data":"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"}
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.091821 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwtsn"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.091833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwtsn" event={"ID":"a2cc63cf-968e-4695-bb7a-bd8938511c29","Type":"ContainerDied","Data":"61e7a835618349eda7c91ee47d139dfad4d5f13ad9f24f1aa5d97237bd5616ae"}
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.091860 4775 scope.go:117] "RemoveContainer" containerID="cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.114464 4775 scope.go:117] "RemoveContainer" containerID="3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.128798 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.138675 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xwtsn"]
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.145317 4775 scope.go:117] "RemoveContainer" containerID="e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.186602 4775 scope.go:117] "RemoveContainer" containerID="cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"
Mar 21 05:55:59 crc kubenswrapper[4775]: E0321 05:55:59.187294 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d\": container with ID starting with cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d not found: ID does not exist" containerID="cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.187334 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d"} err="failed to get container status \"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d\": rpc error: code = NotFound desc = could not find container \"cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d\": container with ID starting with cb36d15d10b6e43c11f3066b543cd0135060dab87e1c9fb8292210fb109dc69d not found: ID does not exist"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.187360 4775 scope.go:117] "RemoveContainer" containerID="3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"
Mar 21 05:55:59 crc kubenswrapper[4775]: E0321 05:55:59.187799 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4\": container with ID starting with 3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4 not found: ID does not exist" containerID="3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.187826 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4"} err="failed to get container status \"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4\": rpc error: code = NotFound desc = could not find container \"3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4\": container with ID starting with 3797b6886ae493c02ef28cc86da9e4074a1002593383e3ae858d4f7bf02221e4 not found: ID does not exist"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.187841 4775 scope.go:117] "RemoveContainer" containerID="e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"
Mar 21 05:55:59 crc kubenswrapper[4775]: E0321 05:55:59.188285 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd\": container with ID starting with e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd not found: ID does not exist" containerID="e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.188308 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd"} err="failed to get container status \"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd\": rpc error: code = NotFound desc = could not find container \"e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd\": container with ID starting with e43955e05be1c11e78ad56fed49e1899127312a909955f2f2464f37be4e186fd not found: ID does not exist"
Mar 21 05:55:59 crc kubenswrapper[4775]: I0321 05:55:59.675079 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" path="/var/lib/kubelet/pods/a2cc63cf-968e-4695-bb7a-bd8938511c29/volumes"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.148437 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567876-cl4xc"]
Mar 21 05:56:00 crc kubenswrapper[4775]: E0321 05:56:00.149288 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="extract-content"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.149304 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="extract-content"
Mar 21 05:56:00 crc kubenswrapper[4775]: E0321 05:56:00.149349 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="registry-server"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.149357 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="registry-server"
Mar 21 05:56:00 crc kubenswrapper[4775]: E0321 05:56:00.149383 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="extract-utilities"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.149394 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="extract-utilities"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.149686 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2cc63cf-968e-4695-bb7a-bd8938511c29" containerName="registry-server"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.150510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-cl4xc"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.153187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.153640 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.153805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.159154 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-cl4xc"]
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.251197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw5bd\" (UniqueName: \"kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd\") pod \"auto-csr-approver-29567876-cl4xc\" (UID: \"77bcfe47-3b2b-4864-8659-69053e25c0a7\") " pod="openshift-infra/auto-csr-approver-29567876-cl4xc"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.353589 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw5bd\" (UniqueName: \"kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd\") pod \"auto-csr-approver-29567876-cl4xc\" (UID: \"77bcfe47-3b2b-4864-8659-69053e25c0a7\") " pod="openshift-infra/auto-csr-approver-29567876-cl4xc"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.600722 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw5bd\" (UniqueName: \"kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd\") pod \"auto-csr-approver-29567876-cl4xc\" (UID: \"77bcfe47-3b2b-4864-8659-69053e25c0a7\") " pod="openshift-infra/auto-csr-approver-29567876-cl4xc"
Mar 21 05:56:00 crc kubenswrapper[4775]: I0321 05:56:00.770138 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-cl4xc"
Mar 21 05:56:01 crc kubenswrapper[4775]: W0321 05:56:01.236251 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77bcfe47_3b2b_4864_8659_69053e25c0a7.slice/crio-42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a WatchSource:0}: Error finding container 42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a: Status 404 returned error can't find the container with id 42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a
Mar 21 05:56:01 crc kubenswrapper[4775]: I0321 05:56:01.239099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-cl4xc"]
Mar 21 05:56:02 crc kubenswrapper[4775]: I0321 05:56:02.123630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-cl4xc" event={"ID":"77bcfe47-3b2b-4864-8659-69053e25c0a7","Type":"ContainerStarted","Data":"42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a"}
Mar 21 05:56:03 crc kubenswrapper[4775]: I0321 05:56:03.133950 4775 generic.go:334] "Generic (PLEG): container finished" podID="77bcfe47-3b2b-4864-8659-69053e25c0a7" containerID="ff881789564104cd73abc465716a89e9559dbe987cab602aecfdcfd1bd6865c8" exitCode=0
Mar 21 05:56:03 crc kubenswrapper[4775]: I0321 05:56:03.134054 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-cl4xc" event={"ID":"77bcfe47-3b2b-4864-8659-69053e25c0a7","Type":"ContainerDied","Data":"ff881789564104cd73abc465716a89e9559dbe987cab602aecfdcfd1bd6865c8"}
Mar 21 05:56:04 crc kubenswrapper[4775]: I0321 05:56:04.598536 4775 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-cl4xc" Mar 21 05:56:04 crc kubenswrapper[4775]: I0321 05:56:04.643733 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw5bd\" (UniqueName: \"kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd\") pod \"77bcfe47-3b2b-4864-8659-69053e25c0a7\" (UID: \"77bcfe47-3b2b-4864-8659-69053e25c0a7\") " Mar 21 05:56:04 crc kubenswrapper[4775]: I0321 05:56:04.653850 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd" (OuterVolumeSpecName: "kube-api-access-rw5bd") pod "77bcfe47-3b2b-4864-8659-69053e25c0a7" (UID: "77bcfe47-3b2b-4864-8659-69053e25c0a7"). InnerVolumeSpecName "kube-api-access-rw5bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:04 crc kubenswrapper[4775]: I0321 05:56:04.747192 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw5bd\" (UniqueName: \"kubernetes.io/projected/77bcfe47-3b2b-4864-8659-69053e25c0a7-kube-api-access-rw5bd\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:05 crc kubenswrapper[4775]: I0321 05:56:05.160895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567876-cl4xc" event={"ID":"77bcfe47-3b2b-4864-8659-69053e25c0a7","Type":"ContainerDied","Data":"42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a"} Mar 21 05:56:05 crc kubenswrapper[4775]: I0321 05:56:05.160939 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567876-cl4xc" Mar 21 05:56:05 crc kubenswrapper[4775]: I0321 05:56:05.160946 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42715204dec9b60bfad42994e43ccd1afd991606987f22c43b1f2b84ef7f590a" Mar 21 05:56:05 crc kubenswrapper[4775]: I0321 05:56:05.675266 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-hdgmt"] Mar 21 05:56:05 crc kubenswrapper[4775]: I0321 05:56:05.707304 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567870-hdgmt"] Mar 21 05:56:07 crc kubenswrapper[4775]: I0321 05:56:07.672817 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b54d80b-a103-474c-8214-44ec9b595a43" path="/var/lib/kubelet/pods/3b54d80b-a103-474c-8214-44ec9b595a43/volumes" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.245012 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:23 crc kubenswrapper[4775]: E0321 05:56:23.246325 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bcfe47-3b2b-4864-8659-69053e25c0a7" containerName="oc" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.246344 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bcfe47-3b2b-4864-8659-69053e25c0a7" containerName="oc" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.247114 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bcfe47-3b2b-4864-8659-69053e25c0a7" containerName="oc" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.248913 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.269819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.301097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.301184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54rl\" (UniqueName: \"kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.301223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.402950 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v54rl\" (UniqueName: \"kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.403040 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.403207 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.403644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.403683 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.429086 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v54rl\" (UniqueName: \"kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl\") pod \"community-operators-hvhrr\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:23 crc kubenswrapper[4775]: I0321 05:56:23.581861 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:24 crc kubenswrapper[4775]: I0321 05:56:24.154461 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:24 crc kubenswrapper[4775]: I0321 05:56:24.880089 4775 generic.go:334] "Generic (PLEG): container finished" podID="a9132264-94c1-407d-87ab-d7f546907220" containerID="56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf" exitCode=0 Mar 21 05:56:24 crc kubenswrapper[4775]: I0321 05:56:24.880209 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerDied","Data":"56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf"} Mar 21 05:56:24 crc kubenswrapper[4775]: I0321 05:56:24.880456 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerStarted","Data":"2585c889d397a5cc818abfd7d487d4c812588c6d0795639c8158b4cc4e2799db"} Mar 21 05:56:25 crc kubenswrapper[4775]: I0321 05:56:25.901487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerStarted","Data":"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259"} Mar 21 05:56:27 crc kubenswrapper[4775]: I0321 05:56:27.939621 4775 generic.go:334] "Generic (PLEG): container finished" podID="a9132264-94c1-407d-87ab-d7f546907220" containerID="db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259" exitCode=0 Mar 21 05:56:27 crc kubenswrapper[4775]: I0321 05:56:27.939669 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" 
event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerDied","Data":"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259"} Mar 21 05:56:28 crc kubenswrapper[4775]: I0321 05:56:28.950026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerStarted","Data":"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43"} Mar 21 05:56:28 crc kubenswrapper[4775]: I0321 05:56:28.976427 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvhrr" podStartSLOduration=2.482502112 podStartE2EDuration="5.976405937s" podCreationTimestamp="2026-03-21 05:56:23 +0000 UTC" firstStartedPulling="2026-03-21 05:56:24.88177193 +0000 UTC m=+4137.858235554" lastFinishedPulling="2026-03-21 05:56:28.375675755 +0000 UTC m=+4141.352139379" observedRunningTime="2026-03-21 05:56:28.970483199 +0000 UTC m=+4141.946946823" watchObservedRunningTime="2026-03-21 05:56:28.976405937 +0000 UTC m=+4141.952869561" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.039210 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2gd5d/must-gather-fmnth"] Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.041878 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.043834 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2gd5d"/"openshift-service-ca.crt" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.043897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2gd5d"/"default-dockercfg-hpcs9" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.045411 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2gd5d"/"kube-root-ca.crt" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.050624 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2gd5d/must-gather-fmnth"] Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.065896 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jqm\" (UniqueName: \"kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.065981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.167381 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jqm\" (UniqueName: \"kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " 
pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.167461 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.168036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.189548 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jqm\" (UniqueName: \"kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm\") pod \"must-gather-fmnth\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.362008 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.481797 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:56:32 crc kubenswrapper[4775]: I0321 05:56:32.481868 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.118563 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2gd5d/must-gather-fmnth"] Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.582380 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.582647 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.651323 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.996180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/must-gather-fmnth" event={"ID":"f8e9d6cc-a02d-4747-8f7c-f439b3227b11","Type":"ContainerStarted","Data":"7c0d546b19a6c80ff65ea248d24b3db62171d0f126c07845976d4d5ff49b7d79"} Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.996512 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-2gd5d/must-gather-fmnth" event={"ID":"f8e9d6cc-a02d-4747-8f7c-f439b3227b11","Type":"ContainerStarted","Data":"9e98fbe03eab0e6c4baf6973b4872261d6f0c83d76001f189bcb6eab76312381"} Mar 21 05:56:33 crc kubenswrapper[4775]: I0321 05:56:33.996525 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/must-gather-fmnth" event={"ID":"f8e9d6cc-a02d-4747-8f7c-f439b3227b11","Type":"ContainerStarted","Data":"7c4f80f0df39580d8656dd019ad5ecda89f94b6664de1b8c9c62c10087b36fe2"} Mar 21 05:56:34 crc kubenswrapper[4775]: I0321 05:56:34.021067 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2gd5d/must-gather-fmnth" podStartSLOduration=2.021049874 podStartE2EDuration="2.021049874s" podCreationTimestamp="2026-03-21 05:56:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:56:34.013669974 +0000 UTC m=+4146.990133608" watchObservedRunningTime="2026-03-21 05:56:34.021049874 +0000 UTC m=+4146.997513498" Mar 21 05:56:34 crc kubenswrapper[4775]: I0321 05:56:34.657461 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:34 crc kubenswrapper[4775]: I0321 05:56:34.713242 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:36 crc kubenswrapper[4775]: I0321 05:56:36.013017 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvhrr" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="registry-server" containerID="cri-o://124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43" gracePeriod=2 Mar 21 05:56:36 crc kubenswrapper[4775]: I0321 05:56:36.982091 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.025830 4775 generic.go:334] "Generic (PLEG): container finished" podID="a9132264-94c1-407d-87ab-d7f546907220" containerID="124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43" exitCode=0 Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.025888 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerDied","Data":"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43"} Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.025921 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvhrr" event={"ID":"a9132264-94c1-407d-87ab-d7f546907220","Type":"ContainerDied","Data":"2585c889d397a5cc818abfd7d487d4c812588c6d0795639c8158b4cc4e2799db"} Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.025942 4775 scope.go:117] "RemoveContainer" containerID="124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.026167 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvhrr" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.049378 4775 scope.go:117] "RemoveContainer" containerID="db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.063941 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content\") pod \"a9132264-94c1-407d-87ab-d7f546907220\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.064083 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities\") pod \"a9132264-94c1-407d-87ab-d7f546907220\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.064327 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v54rl\" (UniqueName: \"kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl\") pod \"a9132264-94c1-407d-87ab-d7f546907220\" (UID: \"a9132264-94c1-407d-87ab-d7f546907220\") " Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.066686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities" (OuterVolumeSpecName: "utilities") pod "a9132264-94c1-407d-87ab-d7f546907220" (UID: "a9132264-94c1-407d-87ab-d7f546907220"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.076898 4775 scope.go:117] "RemoveContainer" containerID="56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.081724 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl" (OuterVolumeSpecName: "kube-api-access-v54rl") pod "a9132264-94c1-407d-87ab-d7f546907220" (UID: "a9132264-94c1-407d-87ab-d7f546907220"). InnerVolumeSpecName "kube-api-access-v54rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.126378 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9132264-94c1-407d-87ab-d7f546907220" (UID: "a9132264-94c1-407d-87ab-d7f546907220"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.151045 4775 scope.go:117] "RemoveContainer" containerID="124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43" Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.151630 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43\": container with ID starting with 124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43 not found: ID does not exist" containerID="124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.151668 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43"} err="failed to get container status \"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43\": rpc error: code = NotFound desc = could not find container \"124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43\": container with ID starting with 124837a8aeed307cbf3f0d2a7f042cf5856dac1eb3286c8f4ca6d73a2fd39f43 not found: ID does not exist" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.151695 4775 scope.go:117] "RemoveContainer" containerID="db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259" Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.152296 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259\": container with ID starting with db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259 not found: ID does not exist" containerID="db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.152352 
4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259"} err="failed to get container status \"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259\": rpc error: code = NotFound desc = could not find container \"db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259\": container with ID starting with db2adef6aa111c2105ea410f3c602299796961892b8bbcc0eaa8ea40de324259 not found: ID does not exist" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.152385 4775 scope.go:117] "RemoveContainer" containerID="56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf" Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.152731 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf\": container with ID starting with 56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf not found: ID does not exist" containerID="56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.152766 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf"} err="failed to get container status \"56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf\": rpc error: code = NotFound desc = could not find container \"56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf\": container with ID starting with 56ec5eaa6447799381690c7e64db1dc2298dedf7d9f2b726e9f414fe1bc074cf not found: ID does not exist" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.167356 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.167409 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v54rl\" (UniqueName: \"kubernetes.io/projected/a9132264-94c1-407d-87ab-d7f546907220-kube-api-access-v54rl\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.167422 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9132264-94c1-407d-87ab-d7f546907220-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.319947 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-d75fs"] Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.320394 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="extract-utilities" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.320413 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="extract-utilities" Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.320430 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="extract-content" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.320439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="extract-content" Mar 21 05:56:37 crc kubenswrapper[4775]: E0321 05:56:37.320458 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="registry-server" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.320465 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="registry-server" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 
05:56:37.320829 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9132264-94c1-407d-87ab-d7f546907220" containerName="registry-server" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.321494 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.365502 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.376087 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvhrr"] Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.473375 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9hg\" (UniqueName: \"kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.473428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.575691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9hg\" (UniqueName: \"kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.575759 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.575984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.598175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9hg\" (UniqueName: \"kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg\") pod \"crc-debug-d75fs\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.643399 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:56:37 crc kubenswrapper[4775]: I0321 05:56:37.675505 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9132264-94c1-407d-87ab-d7f546907220" path="/var/lib/kubelet/pods/a9132264-94c1-407d-87ab-d7f546907220/volumes" Mar 21 05:56:38 crc kubenswrapper[4775]: I0321 05:56:38.050409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" event={"ID":"02fd708d-0508-40e9-8591-02cc73141ca4","Type":"ContainerStarted","Data":"bd3bbff9eadc760f7629d28db8d7131fe45713ac32e2244fcd75d129dcb618b6"} Mar 21 05:56:39 crc kubenswrapper[4775]: I0321 05:56:39.062908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" event={"ID":"02fd708d-0508-40e9-8591-02cc73141ca4","Type":"ContainerStarted","Data":"6458d3534ba667913e8d34c99b54452e526574493e208030a9412e60968bc790"} Mar 21 05:56:39 crc kubenswrapper[4775]: I0321 05:56:39.081345 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" podStartSLOduration=2.0813220550000002 podStartE2EDuration="2.081322055s" podCreationTimestamp="2026-03-21 05:56:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:56:39.079770141 +0000 UTC m=+4152.056233775" watchObservedRunningTime="2026-03-21 05:56:39.081322055 +0000 UTC m=+4152.057785679" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.170611 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.173991 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.179823 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.309280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc7n2\" (UniqueName: \"kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.309339 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.309765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.411791 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc7n2\" (UniqueName: \"kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.411864 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.412019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.412436 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.412651 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.439071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc7n2\" (UniqueName: \"kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2\") pod \"redhat-marketplace-hf79m\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.502984 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:49 crc kubenswrapper[4775]: I0321 05:56:49.580816 4775 scope.go:117] "RemoveContainer" containerID="21e7854b61e199821209ba3f9487dda8cf82083673134efff058e8fd96d17f3f" Mar 21 05:56:50 crc kubenswrapper[4775]: I0321 05:56:50.098556 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:56:50 crc kubenswrapper[4775]: I0321 05:56:50.169342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerStarted","Data":"6119ecf3284a2c38c85c0126ae363c2d0c4d602c9e56e0ece76b212ea792f2ac"} Mar 21 05:56:51 crc kubenswrapper[4775]: I0321 05:56:51.180987 4775 generic.go:334] "Generic (PLEG): container finished" podID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerID="b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89" exitCode=0 Mar 21 05:56:51 crc kubenswrapper[4775]: I0321 05:56:51.181096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerDied","Data":"b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89"} Mar 21 05:56:52 crc kubenswrapper[4775]: I0321 05:56:52.192572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerStarted","Data":"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37"} Mar 21 05:56:54 crc kubenswrapper[4775]: I0321 05:56:54.225545 4775 generic.go:334] "Generic (PLEG): container finished" podID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerID="26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37" exitCode=0 Mar 21 05:56:54 crc kubenswrapper[4775]: I0321 05:56:54.225632 4775 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerDied","Data":"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37"} Mar 21 05:56:56 crc kubenswrapper[4775]: I0321 05:56:56.252763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerStarted","Data":"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c"} Mar 21 05:56:56 crc kubenswrapper[4775]: I0321 05:56:56.280717 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hf79m" podStartSLOduration=3.448545967 podStartE2EDuration="7.280693148s" podCreationTimestamp="2026-03-21 05:56:49 +0000 UTC" firstStartedPulling="2026-03-21 05:56:51.184549508 +0000 UTC m=+4164.161013132" lastFinishedPulling="2026-03-21 05:56:55.016696689 +0000 UTC m=+4167.993160313" observedRunningTime="2026-03-21 05:56:56.275880162 +0000 UTC m=+4169.252343796" watchObservedRunningTime="2026-03-21 05:56:56.280693148 +0000 UTC m=+4169.257156782" Mar 21 05:56:59 crc kubenswrapper[4775]: I0321 05:56:59.503944 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:59 crc kubenswrapper[4775]: I0321 05:56:59.504413 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:56:59 crc kubenswrapper[4775]: I0321 05:56:59.557351 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:57:00 crc kubenswrapper[4775]: I0321 05:57:00.337895 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:57:00 crc kubenswrapper[4775]: I0321 
05:57:00.408220 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.301218 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hf79m" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="registry-server" containerID="cri-o://388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c" gracePeriod=2 Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.482595 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.482650 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.875416 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.927045 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities" (OuterVolumeSpecName: "utilities") pod "8022fa19-b95f-4ac2-a8cc-6a47a874fd34" (UID: "8022fa19-b95f-4ac2-a8cc-6a47a874fd34"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.927418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities\") pod \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.927597 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content\") pod \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.927651 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc7n2\" (UniqueName: \"kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2\") pod \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\" (UID: \"8022fa19-b95f-4ac2-a8cc-6a47a874fd34\") " Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.928056 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:02 crc kubenswrapper[4775]: I0321 05:57:02.963008 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8022fa19-b95f-4ac2-a8cc-6a47a874fd34" (UID: "8022fa19-b95f-4ac2-a8cc-6a47a874fd34"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.029695 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.312722 4775 generic.go:334] "Generic (PLEG): container finished" podID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerID="388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c" exitCode=0 Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.312766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerDied","Data":"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c"} Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.312799 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hf79m" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.312824 4775 scope.go:117] "RemoveContainer" containerID="388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.312810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hf79m" event={"ID":"8022fa19-b95f-4ac2-a8cc-6a47a874fd34","Type":"ContainerDied","Data":"6119ecf3284a2c38c85c0126ae363c2d0c4d602c9e56e0ece76b212ea792f2ac"} Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.338412 4775 scope.go:117] "RemoveContainer" containerID="26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.500384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2" (OuterVolumeSpecName: "kube-api-access-bc7n2") pod "8022fa19-b95f-4ac2-a8cc-6a47a874fd34" (UID: "8022fa19-b95f-4ac2-a8cc-6a47a874fd34"). InnerVolumeSpecName "kube-api-access-bc7n2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.537742 4775 scope.go:117] "RemoveContainer" containerID="b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.538507 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc7n2\" (UniqueName: \"kubernetes.io/projected/8022fa19-b95f-4ac2-a8cc-6a47a874fd34-kube-api-access-bc7n2\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.614626 4775 scope.go:117] "RemoveContainer" containerID="388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c" Mar 21 05:57:03 crc kubenswrapper[4775]: E0321 05:57:03.615052 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c\": container with ID starting with 388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c not found: ID does not exist" containerID="388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.615100 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c"} err="failed to get container status \"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c\": rpc error: code = NotFound desc = could not find container \"388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c\": container with ID starting with 388eee5beefb800def090d2007d8dfc0e1863d473480f91a687b60b9e0f1893c not found: ID does not exist" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.615142 4775 scope.go:117] "RemoveContainer" containerID="26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37" Mar 21 05:57:03 crc kubenswrapper[4775]: E0321 05:57:03.615451 
4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37\": container with ID starting with 26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37 not found: ID does not exist" containerID="26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.615582 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37"} err="failed to get container status \"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37\": rpc error: code = NotFound desc = could not find container \"26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37\": container with ID starting with 26d7e2528fec25cd60be964aaf6b1f8773399257833c635b71323a506857fd37 not found: ID does not exist" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.615916 4775 scope.go:117] "RemoveContainer" containerID="b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89" Mar 21 05:57:03 crc kubenswrapper[4775]: E0321 05:57:03.616398 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89\": container with ID starting with b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89 not found: ID does not exist" containerID="b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.616435 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89"} err="failed to get container status \"b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89\": rpc error: code = 
NotFound desc = could not find container \"b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89\": container with ID starting with b49b1a271ad4bd5b721f831a551f4d151d0600192cfd36b385e5c58ea9cc4d89 not found: ID does not exist" Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.682052 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:57:03 crc kubenswrapper[4775]: I0321 05:57:03.707515 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hf79m"] Mar 21 05:57:05 crc kubenswrapper[4775]: I0321 05:57:05.674032 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" path="/var/lib/kubelet/pods/8022fa19-b95f-4ac2-a8cc-6a47a874fd34/volumes" Mar 21 05:57:16 crc kubenswrapper[4775]: I0321 05:57:16.448886 4775 generic.go:334] "Generic (PLEG): container finished" podID="02fd708d-0508-40e9-8591-02cc73141ca4" containerID="6458d3534ba667913e8d34c99b54452e526574493e208030a9412e60968bc790" exitCode=0 Mar 21 05:57:16 crc kubenswrapper[4775]: I0321 05:57:16.448989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" event={"ID":"02fd708d-0508-40e9-8591-02cc73141ca4","Type":"ContainerDied","Data":"6458d3534ba667913e8d34c99b54452e526574493e208030a9412e60968bc790"} Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.589211 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.602431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl9hg\" (UniqueName: \"kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg\") pod \"02fd708d-0508-40e9-8591-02cc73141ca4\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.602547 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host\") pod \"02fd708d-0508-40e9-8591-02cc73141ca4\" (UID: \"02fd708d-0508-40e9-8591-02cc73141ca4\") " Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.602952 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host" (OuterVolumeSpecName: "host") pod "02fd708d-0508-40e9-8591-02cc73141ca4" (UID: "02fd708d-0508-40e9-8591-02cc73141ca4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.628319 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg" (OuterVolumeSpecName: "kube-api-access-hl9hg") pod "02fd708d-0508-40e9-8591-02cc73141ca4" (UID: "02fd708d-0508-40e9-8591-02cc73141ca4"). InnerVolumeSpecName "kube-api-access-hl9hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.636423 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-d75fs"] Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.648730 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-d75fs"] Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.674376 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02fd708d-0508-40e9-8591-02cc73141ca4" path="/var/lib/kubelet/pods/02fd708d-0508-40e9-8591-02cc73141ca4/volumes" Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.714830 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl9hg\" (UniqueName: \"kubernetes.io/projected/02fd708d-0508-40e9-8591-02cc73141ca4-kube-api-access-hl9hg\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:17 crc kubenswrapper[4775]: I0321 05:57:17.714870 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fd708d-0508-40e9-8591-02cc73141ca4-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.476590 4775 scope.go:117] "RemoveContainer" containerID="6458d3534ba667913e8d34c99b54452e526574493e208030a9412e60968bc790" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.476818 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-d75fs" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.932980 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-8rplg"] Mar 21 05:57:18 crc kubenswrapper[4775]: E0321 05:57:18.933787 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="extract-content" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.933799 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="extract-content" Mar 21 05:57:18 crc kubenswrapper[4775]: E0321 05:57:18.933815 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fd708d-0508-40e9-8591-02cc73141ca4" containerName="container-00" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.933823 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fd708d-0508-40e9-8591-02cc73141ca4" containerName="container-00" Mar 21 05:57:18 crc kubenswrapper[4775]: E0321 05:57:18.933843 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="registry-server" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.933851 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="registry-server" Mar 21 05:57:18 crc kubenswrapper[4775]: E0321 05:57:18.933871 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="extract-utilities" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.933876 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" containerName="extract-utilities" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.934033 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8022fa19-b95f-4ac2-a8cc-6a47a874fd34" 
containerName="registry-server" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.934055 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fd708d-0508-40e9-8591-02cc73141ca4" containerName="container-00" Mar 21 05:57:18 crc kubenswrapper[4775]: I0321 05:57:18.935005 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.033902 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.034351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrj8\" (UniqueName: \"kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.135849 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrj8\" (UniqueName: \"kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.136311 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc 
kubenswrapper[4775]: I0321 05:57:19.136425 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.160251 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrj8\" (UniqueName: \"kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8\") pod \"crc-debug-8rplg\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.291464 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:19 crc kubenswrapper[4775]: I0321 05:57:19.486670 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" event={"ID":"90863e29-81bb-477f-846a-7f9c27ed3b89","Type":"ContainerStarted","Data":"a273f01b54aa2eb40ad9286487d5ace47db2a7af3a064d365f343f92de6cf798"} Mar 21 05:57:20 crc kubenswrapper[4775]: I0321 05:57:20.506976 4775 generic.go:334] "Generic (PLEG): container finished" podID="90863e29-81bb-477f-846a-7f9c27ed3b89" containerID="9b324351e03c9b33d2013ebe58b0174e6c17561e8f6660c56e5da466bb4b8c3d" exitCode=0 Mar 21 05:57:20 crc kubenswrapper[4775]: I0321 05:57:20.507693 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" event={"ID":"90863e29-81bb-477f-846a-7f9c27ed3b89","Type":"ContainerDied","Data":"9b324351e03c9b33d2013ebe58b0174e6c17561e8f6660c56e5da466bb4b8c3d"} Mar 21 05:57:20 crc kubenswrapper[4775]: I0321 05:57:20.980326 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-8rplg"] Mar 21 05:57:20 crc 
kubenswrapper[4775]: I0321 05:57:20.994145 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-8rplg"] Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.658456 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.792376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host\") pod \"90863e29-81bb-477f-846a-7f9c27ed3b89\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.792467 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host" (OuterVolumeSpecName: "host") pod "90863e29-81bb-477f-846a-7f9c27ed3b89" (UID: "90863e29-81bb-477f-846a-7f9c27ed3b89"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.792873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrj8\" (UniqueName: \"kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8\") pod \"90863e29-81bb-477f-846a-7f9c27ed3b89\" (UID: \"90863e29-81bb-477f-846a-7f9c27ed3b89\") " Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.793337 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90863e29-81bb-477f-846a-7f9c27ed3b89-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.802441 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8" (OuterVolumeSpecName: "kube-api-access-mlrj8") pod "90863e29-81bb-477f-846a-7f9c27ed3b89" (UID: "90863e29-81bb-477f-846a-7f9c27ed3b89"). InnerVolumeSpecName "kube-api-access-mlrj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:57:21 crc kubenswrapper[4775]: I0321 05:57:21.895862 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrj8\" (UniqueName: \"kubernetes.io/projected/90863e29-81bb-477f-846a-7f9c27ed3b89-kube-api-access-mlrj8\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.188557 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-xh575"] Mar 21 05:57:22 crc kubenswrapper[4775]: E0321 05:57:22.189099 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90863e29-81bb-477f-846a-7f9c27ed3b89" containerName="container-00" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.189138 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="90863e29-81bb-477f-846a-7f9c27ed3b89" containerName="container-00" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.189351 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="90863e29-81bb-477f-846a-7f9c27ed3b89" containerName="container-00" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.190148 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.207240 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.207432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2s5\" (UniqueName: \"kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.308188 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.308361 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.308421 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2s5\" (UniqueName: \"kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc 
kubenswrapper[4775]: I0321 05:57:22.326635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2s5\" (UniqueName: \"kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5\") pod \"crc-debug-xh575\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.511435 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.529156 4775 scope.go:117] "RemoveContainer" containerID="9b324351e03c9b33d2013ebe58b0174e6c17561e8f6660c56e5da466bb4b8c3d" Mar 21 05:57:22 crc kubenswrapper[4775]: I0321 05:57:22.529203 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-8rplg" Mar 21 05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.538256 4775 generic.go:334] "Generic (PLEG): container finished" podID="6f7e5b9b-4224-400b-b466-98e72682898e" containerID="695f561171dabe615374495a7ceda94563edda67471d58dbff6e220446956791" exitCode=0 Mar 21 05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.538335 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-xh575" event={"ID":"6f7e5b9b-4224-400b-b466-98e72682898e","Type":"ContainerDied","Data":"695f561171dabe615374495a7ceda94563edda67471d58dbff6e220446956791"} Mar 21 05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.538926 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/crc-debug-xh575" event={"ID":"6f7e5b9b-4224-400b-b466-98e72682898e","Type":"ContainerStarted","Data":"0433825eda899e4fbbddf6ea9298598eaa49439fe9100da79b0d50995098bcac"} Mar 21 05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.581400 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-xh575"] Mar 21 
05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.594196 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2gd5d/crc-debug-xh575"] Mar 21 05:57:23 crc kubenswrapper[4775]: I0321 05:57:23.674624 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90863e29-81bb-477f-846a-7f9c27ed3b89" path="/var/lib/kubelet/pods/90863e29-81bb-477f-846a-7f9c27ed3b89/volumes" Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.651950 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.755682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host\") pod \"6f7e5b9b-4224-400b-b466-98e72682898e\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.755806 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q2s5\" (UniqueName: \"kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5\") pod \"6f7e5b9b-4224-400b-b466-98e72682898e\" (UID: \"6f7e5b9b-4224-400b-b466-98e72682898e\") " Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.755824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host" (OuterVolumeSpecName: "host") pod "6f7e5b9b-4224-400b-b466-98e72682898e" (UID: "6f7e5b9b-4224-400b-b466-98e72682898e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.756814 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f7e5b9b-4224-400b-b466-98e72682898e-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.776972 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5" (OuterVolumeSpecName: "kube-api-access-6q2s5") pod "6f7e5b9b-4224-400b-b466-98e72682898e" (UID: "6f7e5b9b-4224-400b-b466-98e72682898e"). InnerVolumeSpecName "kube-api-access-6q2s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:57:24 crc kubenswrapper[4775]: I0321 05:57:24.858779 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q2s5\" (UniqueName: \"kubernetes.io/projected/6f7e5b9b-4224-400b-b466-98e72682898e-kube-api-access-6q2s5\") on node \"crc\" DevicePath \"\"" Mar 21 05:57:25 crc kubenswrapper[4775]: I0321 05:57:25.557674 4775 scope.go:117] "RemoveContainer" containerID="695f561171dabe615374495a7ceda94563edda67471d58dbff6e220446956791" Mar 21 05:57:25 crc kubenswrapper[4775]: I0321 05:57:25.557722 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/crc-debug-xh575" Mar 21 05:57:25 crc kubenswrapper[4775]: I0321 05:57:25.671156 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7e5b9b-4224-400b-b466-98e72682898e" path="/var/lib/kubelet/pods/6f7e5b9b-4224-400b-b466-98e72682898e/volumes" Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.482191 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.482832 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.482885 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.483804 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.483865 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" 
containerName="machine-config-daemon" containerID="cri-o://fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c" gracePeriod=600 Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.643909 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerID="fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c" exitCode=0 Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.643974 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c"} Mar 21 05:57:32 crc kubenswrapper[4775]: I0321 05:57:32.644020 4775 scope.go:117] "RemoveContainer" containerID="fb52ad12aca7a430a7e7f10e8a7f0c10de8bb4e4ef886aa1ca0cb8ffad7427f2" Mar 21 05:57:33 crc kubenswrapper[4775]: I0321 05:57:33.653866 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226"} Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.038425 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77fd86567d-mf2wb_1727040d-36f5-431c-b8f1-84e206146dcf/barbican-api/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.156672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567878-jj8nx"] Mar 21 05:58:00 crc kubenswrapper[4775]: E0321 05:58:00.157318 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7e5b9b-4224-400b-b466-98e72682898e" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.157341 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f7e5b9b-4224-400b-b466-98e72682898e" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.157572 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7e5b9b-4224-400b-b466-98e72682898e" containerName="container-00" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.158381 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.159928 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77fd86567d-mf2wb_1727040d-36f5-431c-b8f1-84e206146dcf/barbican-api-log/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.169911 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-jj8nx"] Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.171266 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.171517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.171903 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.251343 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5btc\" (UniqueName: \"kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc\") pod \"auto-csr-approver-29567878-jj8nx\" (UID: \"7ad9f411-868c-4132-a586-c0f66e18edbc\") " pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.305724 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-75f5b547c8-mgjw5_177228f6-7f69-49c2-9942-ea0a98b56d13/barbican-keystone-listener/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.353543 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5btc\" (UniqueName: \"kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc\") pod \"auto-csr-approver-29567878-jj8nx\" (UID: \"7ad9f411-868c-4132-a586-c0f66e18edbc\") " pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.372680 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75f5b547c8-mgjw5_177228f6-7f69-49c2-9942-ea0a98b56d13/barbican-keystone-listener-log/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.376197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5btc\" (UniqueName: \"kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc\") pod \"auto-csr-approver-29567878-jj8nx\" (UID: \"7ad9f411-868c-4132-a586-c0f66e18edbc\") " pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.477408 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6dd9f89d55-fdf8c_d8a7c2e5-3643-4675-9888-3c310e4f9ad4/barbican-worker/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.491543 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.535375 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6dd9f89d55-fdf8c_d8a7c2e5-3643-4675-9888-3c310e4f9ad4/barbican-worker-log/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.852635 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-z5v8m_203df932-0574-4098-b897-ba50813f2ec1/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.859942 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/ceilometer-central-agent/0.log" Mar 21 05:58:00 crc kubenswrapper[4775]: I0321 05:58:00.989806 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/ceilometer-notification-agent/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.014650 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/proxy-httpd/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.038649 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-jj8nx"] Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.112255 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_15d97495-428d-47e0-a115-99c7fd08850a/sg-core/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.261847 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c16d835a-1ec2-473d-b2d8-c8e7c978e140/cinder-api/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.285874 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_c16d835a-1ec2-473d-b2d8-c8e7c978e140/cinder-api-log/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.549032 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d02bb319-292e-449c-8e5b-42c6859f1529/probe/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.551365 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_d02bb319-292e-449c-8e5b-42c6859f1529/cinder-scheduler/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.738952 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2d2z7_5efe4255-484c-47d7-800a-4d0dbc5cecd9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.901101 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ddjjn_8d4d2c78-9cdd-4623-9fdb-e277ce4d85bd/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.941067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" event={"ID":"7ad9f411-868c-4132-a586-c0f66e18edbc","Type":"ContainerStarted","Data":"1fc6d0c3fd1580c0dedc00dd7614c3ddb4d963fc900eb40dcdffbf67947f04b0"} Mar 21 05:58:01 crc kubenswrapper[4775]: I0321 05:58:01.988754 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/init/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.168606 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/init/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.246502 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-mmgp9_4918607c-6074-4fb3-a0a0-8def479058a0/dnsmasq-dns/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.357080 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-h87nr_28040d61-c9ea-4a55-b113-db871dff679c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.500315 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_01638b90-5e17-43b3-a3b5-90726b26e243/glance-httpd/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.545395 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_01638b90-5e17-43b3-a3b5-90726b26e243/glance-log/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.678675 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9e72c4b-a3fb-41eb-974a-74d24d6cdac9/glance-httpd/0.log" Mar 21 05:58:02 crc kubenswrapper[4775]: I0321 05:58:02.712172 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9e72c4b-a3fb-41eb-974a-74d24d6cdac9/glance-log/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.049429 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6496ddbdd4-v5mc5_fc6e433f-9e70-4b09-9780-403634bbe0dc/horizon/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.199267 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jmx8f_8390751b-3911-4a24-a1a2-c3d1d10da875/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.317012 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-6496ddbdd4-v5mc5_fc6e433f-9e70-4b09-9780-403634bbe0dc/horizon-log/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.613689 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kfrz2_dbd64e65-be8d-42e7-a686-d5454932156d/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.752489 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7ee3b7a0-9eb3-4702-8fb7-3286df60b21b/kube-state-metrics/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.896256 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-66c879cfdd-smnxp_9f965feb-5d82-4176-a14d-08a84c4ae794/keystone-api/0.log" Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.965143 4775 generic.go:334] "Generic (PLEG): container finished" podID="7ad9f411-868c-4132-a586-c0f66e18edbc" containerID="a70b4f1d1db681654636f9ae7783b422ab2ef1d53af567edd94ba3f694c4a39b" exitCode=0 Mar 21 05:58:03 crc kubenswrapper[4775]: I0321 05:58:03.965190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" event={"ID":"7ad9f411-868c-4132-a586-c0f66e18edbc","Type":"ContainerDied","Data":"a70b4f1d1db681654636f9ae7783b422ab2ef1d53af567edd94ba3f694c4a39b"} Mar 21 05:58:04 crc kubenswrapper[4775]: I0321 05:58:04.414534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d65998c7c-prp5b_3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef/neutron-api/0.log" Mar 21 05:58:04 crc kubenswrapper[4775]: I0321 05:58:04.432641 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7d65998c7c-prp5b_3c1e6a0a-1e89-4c30-8d9e-57306d73e4ef/neutron-httpd/0.log" Mar 21 05:58:04 crc kubenswrapper[4775]: I0321 05:58:04.706962 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-n2jxz_a71dcc90-c70a-4ff8-bf4a-42f1a2415827/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:04 crc kubenswrapper[4775]: I0321 05:58:04.786739 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z7cd9_29e22bcc-ed74-4093-9cf8-f7a2e3ab9c70/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.420528 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.492666 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_789a0bb8-b131-4144-9400-7c32a604d6d5/nova-cell0-conductor-conductor/0.log" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.517876 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_886c404c-ceec-48e7-90da-96d6aa201152/nova-api-log/0.log" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.558128 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5btc\" (UniqueName: \"kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc\") pod \"7ad9f411-868c-4132-a586-c0f66e18edbc\" (UID: \"7ad9f411-868c-4132-a586-c0f66e18edbc\") " Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.570529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc" (OuterVolumeSpecName: "kube-api-access-j5btc") pod "7ad9f411-868c-4132-a586-c0f66e18edbc" (UID: "7ad9f411-868c-4132-a586-c0f66e18edbc"). InnerVolumeSpecName "kube-api-access-j5btc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.665743 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5btc\" (UniqueName: \"kubernetes.io/projected/7ad9f411-868c-4132-a586-c0f66e18edbc-kube-api-access-j5btc\") on node \"crc\" DevicePath \"\"" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.842707 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b29323a7-476f-4a13-8085-4b2158a68850/nova-cell1-conductor-conductor/0.log" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.966541 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3b7ea443-e30d-41d1-9f42-0bef9d7bd012/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.985191 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.985172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567878-jj8nx" event={"ID":"7ad9f411-868c-4132-a586-c0f66e18edbc","Type":"ContainerDied","Data":"1fc6d0c3fd1580c0dedc00dd7614c3ddb4d963fc900eb40dcdffbf67947f04b0"} Mar 21 05:58:05 crc kubenswrapper[4775]: I0321 05:58:05.985963 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fc6d0c3fd1580c0dedc00dd7614c3ddb4d963fc900eb40dcdffbf67947f04b0" Mar 21 05:58:06 crc kubenswrapper[4775]: I0321 05:58:06.181070 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_886c404c-ceec-48e7-90da-96d6aa201152/nova-api-api/0.log" Mar 21 05:58:06 crc kubenswrapper[4775]: I0321 05:58:06.506896 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-ctnt2"] Mar 21 05:58:06 crc kubenswrapper[4775]: I0321 05:58:06.515982 4775 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567872-ctnt2"] Mar 21 05:58:06 crc kubenswrapper[4775]: I0321 05:58:06.533017 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_208cfa71-8242-4958-b9db-21fc180a6697/nova-metadata-log/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.017939 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-t85b7_50003f97-774c-4321-9ddf-6ac67546b19f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.163549 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bab55731-40da-4831-a8b5-f9c413452367/nova-scheduler-scheduler/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.182726 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/mysql-bootstrap/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.208005 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_208cfa71-8242-4958-b9db-21fc180a6697/nova-metadata-metadata/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.466894 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/mysql-bootstrap/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.533194 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/mysql-bootstrap/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.542385 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bade9789-f227-44ab-b7fa-2173445cd381/galera/0.log" Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.683217 4775 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b19b93e1-3fea-4999-8e31-ba5cc64c91ea" path="/var/lib/kubelet/pods/b19b93e1-3fea-4999-8e31-ba5cc64c91ea/volumes"
Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.741085 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/galera/0.log"
Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.760558 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_76d205b7-bc2e-4dad-b513-457ff20d67e1/mysql-bootstrap/0.log"
Mar 21 05:58:07 crc kubenswrapper[4775]: I0321 05:58:07.800020 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_bb5a0456-b5c5-433a-afde-fe38740e2310/openstackclient/0.log"
Mar 21 05:58:08 crc kubenswrapper[4775]: I0321 05:58:08.622553 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nmtjx_8a8e948c-2978-40c8-961b-1b010f7ea920/ovn-controller/0.log"
Mar 21 05:58:08 crc kubenswrapper[4775]: I0321 05:58:08.667817 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ckh8b_9f8661f4-e47a-4fcc-b55e-4c1eecd0dac8/openstack-network-exporter/0.log"
Mar 21 05:58:08 crc kubenswrapper[4775]: I0321 05:58:08.884359 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server-init/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.083628 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovs-vswitchd/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.148425 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.173965 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-frhpj_88a367f7-4951-4e7c-889f-d147676654f8/ovsdb-server-init/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.449417 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_334a5c95-becc-4389-bb6f-50e5957cded6/openstack-network-exporter/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.467372 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-45kkj_3a3b7bfd-72cf-4ae1-9c2b-8d61e6af8e5d/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.491950 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_334a5c95-becc-4389-bb6f-50e5957cded6/ovn-northd/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.665557 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e/openstack-network-exporter/0.log"
Mar 21 05:58:09 crc kubenswrapper[4775]: I0321 05:58:09.772448 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b4d6ed0c-fd91-4ed6-9a1c-0a31f78ed48e/ovsdbserver-nb/0.log"
Mar 21 05:58:10 crc kubenswrapper[4775]: I0321 05:58:10.470880 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58948d8bb4-rcw89_279dff90-9d39-418a-b5e7-00333a376d16/placement-api/0.log"
Mar 21 05:58:10 crc kubenswrapper[4775]: I0321 05:58:10.481812 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93886182-fca2-42a9-a134-2243c7c7073d/openstack-network-exporter/0.log"
Mar 21 05:58:10 crc kubenswrapper[4775]: I0321 05:58:10.513226 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_93886182-fca2-42a9-a134-2243c7c7073d/ovsdbserver-sb/0.log"
Mar 21 05:58:10 crc kubenswrapper[4775]: I0321 05:58:10.804429 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/setup-container/0.log"
Mar 21 05:58:10 crc kubenswrapper[4775]: I0321 05:58:10.810972 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-58948d8bb4-rcw89_279dff90-9d39-418a-b5e7-00333a376d16/placement-log/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.006470 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/rabbitmq/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.058634 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/setup-container/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.079375 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c95486b5-f2ad-4098-912d-6749b329824b/setup-container/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.350829 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/setup-container/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.375686 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zxxq6_034f630d-d6d6-41f0-8df6-e5db37b778f3/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.423289 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e5e83941-a38d-4ee9-b967-1dac69c5a55b/rabbitmq/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.626506 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c2q8m_f41367b2-433d-48f7-af75-575be4b318fc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.652886 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qgpzc_7c88e417-9ede-41d8-8337-79620ceb7798/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:11 crc kubenswrapper[4775]: I0321 05:58:11.961335 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-wtk52_8cf3d7cc-425b-4d40-a26c-a88d2b210c0a/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.072140 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swg9k_5f45d376-4f59-4584-a545-16d4ff066232/ssh-known-hosts-edpm-deployment/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.289054 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f9b88fb79-vclnv_77432545-f22c-453a-b6a7-7c932712efa9/proxy-server/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.373911 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f9b88fb79-vclnv_77432545-f22c-453a-b6a7-7c932712efa9/proxy-httpd/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.456178 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kzll6_9e521c27-9d67-47bc-b6ac-74fabb543d3f/swift-ring-rebalance/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.574570 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-auditor/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.636293 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-reaper/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.736284 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-replicator/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.843735 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/account-server/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.890606 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-auditor/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.912493 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-replicator/0.log"
Mar 21 05:58:12 crc kubenswrapper[4775]: I0321 05:58:12.961290 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-server/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.080917 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/container-updater/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.120709 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-auditor/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.132734 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-expirer/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.226136 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-replicator/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.406857 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-server/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.414483 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/object-updater/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.429059 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/rsync/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.503642 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8e93b938-c138-4cfc-a227-e1cd648ad59a/swift-recon-cron/0.log"
Mar 21 05:58:13 crc kubenswrapper[4775]: I0321 05:58:13.748364 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1c832898-838d-423d-8ad8-512c5ee5706c/tempest-tests-tempest-tests-runner/0.log"
Mar 21 05:58:14 crc kubenswrapper[4775]: I0321 05:58:14.157148 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2c99faa4-db71-4a05-a018-9c382f33f55e/test-operator-logs-container/0.log"
Mar 21 05:58:14 crc kubenswrapper[4775]: I0321 05:58:14.215786 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2pzqg_bccbefa9-966d-44b9-bd8f-bb566649b315/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:14 crc kubenswrapper[4775]: I0321 05:58:14.389214 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mfft7_268b27f0-a217-459e-9502-7b522ca6fe2c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 21 05:58:25 crc kubenswrapper[4775]: I0321 05:58:25.454365 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2a8305a2-5178-437f-a896-314b34fa595e/memcached/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.010255 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.182973 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.205330 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.222885 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.423711 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/extract/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.432835 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/util/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.433729 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_68694dd73c2979a60ce733190d43e876b2bb9f24864058ec18a9c495bfh5vbt_a8157e0e-bf83-4dbd-af86-d09d14e6e1b1/pull/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.653147 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-twhxx_94e1507b-be6c-4ecf-99e5-2bdcd2cc0cef/manager/0.log"
Mar 21 05:58:43 crc kubenswrapper[4775]: I0321 05:58:43.863386 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-lxvtw_87dcea67-7f65-46a6-996b-3985bf1b5171/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.066720 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-lzvsq_80932361-6406-48dd-9e4b-4e9c27813f68/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.167554 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-m4gqz_0e83601c-758c-4f12-b745-bb68b0c4904f/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.399152 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-7bdbc_01ed348d-8a8a-4717-ba0d-1944b3f1c081/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.650358 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-9g8pv_f00c8c4b-874f-45ec-8a1a-e0834b3fc252/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.966922 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-65f65cc49c-2mgp8_0a66456f-7860-4dc1-9c1c-0db69ddcc800/manager/0.log"
Mar 21 05:58:44 crc kubenswrapper[4775]: I0321 05:58:44.973530 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-c7pjw_49a3c6c7-9e86-495d-8d1a-486d6bfbbbdd/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.178173 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-jq88h_20c78c73-daf5-481e-a4ac-62de73b5969e/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.216557 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dn22m_9fe71acc-7d35-4d4b-ac69-e193d3f39028/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.390835 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-hf688_ec38d53e-6fe4-41b5-8548-e49fadd9d6bf/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.460835 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-4tbvg_9bfe7d25-53ea-484d-a481-0ea04ee2b8a8/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.687319 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-tss7r_02b6af47-2c06-480b-a838-2d742efa1045/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.706058 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-67scw_745c79b1-1bcf-4c0f-82ee-a26cbba46d48/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.928335 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-rwq59_3035739a-202f-4794-bb4f-ae2342a96441/manager/0.log"
Mar 21 05:58:45 crc kubenswrapper[4775]: I0321 05:58:45.973241 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-85fcfb8fbb-q2k4k_899f7d20-7208-419e-b0f8-36c7fbf2e841/operator/0.log"
Mar 21 05:58:46 crc kubenswrapper[4775]: I0321 05:58:46.169913 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hvtkg_8c7426e8-8cec-4c84-8810-03a091d87cd9/registry-server/0.log"
Mar 21 05:58:46 crc kubenswrapper[4775]: I0321 05:58:46.470387 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-ctm9h_6eeb04ad-7251-488c-bd52-b2f14f6fb68b/manager/0.log"
Mar 21 05:58:46 crc kubenswrapper[4775]: I0321 05:58:46.560614 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-jj4pt_898b32c5-9f21-4fba-90c5-a333f36addf2/manager/0.log"
Mar 21 05:58:46 crc kubenswrapper[4775]: I0321 05:58:46.774578 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jvglf_95a44b12-e027-400d-b257-99f2012251d8/operator/0.log"
Mar 21 05:58:46 crc kubenswrapper[4775]: I0321 05:58:46.920902 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-lqmgv_5968f1d9-f4e0-4c67-923e-2494e15c4088/manager/0.log"
Mar 21 05:58:47 crc kubenswrapper[4775]: I0321 05:58:47.136067 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-9s82h_d1dbd80a-0782-4035-a263-b52a90f6ee0e/manager/0.log"
Mar 21 05:58:47 crc kubenswrapper[4775]: I0321 05:58:47.358477 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-l59gx_9cac78ed-6325-4649-bb05-a1518ae692e9/manager/0.log"
Mar 21 05:58:47 crc kubenswrapper[4775]: I0321 05:58:47.367804 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65746ff4dc-hg4rq_907f0cdf-2d87-4d09-97af-5591d061b4f6/manager/0.log"
Mar 21 05:58:47 crc kubenswrapper[4775]: I0321 05:58:47.456822 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-gdrc5_3c9f18bd-def6-45ff-a92b-25c6f40d6bb5/manager/0.log"
Mar 21 05:58:49 crc kubenswrapper[4775]: I0321 05:58:49.898016 4775 scope.go:117] "RemoveContainer" containerID="25394576ff3d5198021128077858159917ead74d0d3b8d2389ae8fb099e54be8"
Mar 21 05:59:07 crc kubenswrapper[4775]: I0321 05:59:07.896964 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d7xpn_e5224539-6d29-4bc3-9656-4665eb287e28/control-plane-machine-set-operator/0.log"
Mar 21 05:59:07 crc kubenswrapper[4775]: I0321 05:59:07.989347 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5rnj6_fe3df1e1-4c22-48df-aaea-469c864f0310/kube-rbac-proxy/0.log"
Mar 21 05:59:08 crc kubenswrapper[4775]: I0321 05:59:08.066736 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5rnj6_fe3df1e1-4c22-48df-aaea-469c864f0310/machine-api-operator/0.log"
Mar 21 05:59:21 crc kubenswrapper[4775]: I0321 05:59:21.477748 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c2lsw_4fdc8b75-b0a1-4ed3-9eee-6ee726dd0fbe/cert-manager-controller/0.log"
Mar 21 05:59:21 crc kubenswrapper[4775]: I0321 05:59:21.617868 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ngckr_172b2006-3394-469a-be7f-1b66d020fd45/cert-manager-cainjector/0.log"
Mar 21 05:59:21 crc kubenswrapper[4775]: I0321 05:59:21.734628 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ms9hv_670f734f-e215-441e-9b56-7251bc7f2484/cert-manager-webhook/0.log"
Mar 21 05:59:32 crc kubenswrapper[4775]: I0321 05:59:32.483583 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 05:59:32 crc kubenswrapper[4775]: I0321 05:59:32.484314 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 05:59:33 crc kubenswrapper[4775]: I0321 05:59:33.891391 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-8x7x2_b0fbab95-1c88-40a3-8ccb-58bca74c8f3c/nmstate-console-plugin/0.log"
Mar 21 05:59:34 crc kubenswrapper[4775]: I0321 05:59:34.177597 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6l74f_d03e5939-1625-4597-ad3b-9edf8e8075f5/nmstate-handler/0.log"
Mar 21 05:59:34 crc kubenswrapper[4775]: I0321 05:59:34.256568 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-626kh_409422f2-717f-4f82-8dae-fe01dfda7083/kube-rbac-proxy/0.log"
Mar 21 05:59:34 crc kubenswrapper[4775]: I0321 05:59:34.349381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-626kh_409422f2-717f-4f82-8dae-fe01dfda7083/nmstate-metrics/0.log"
Mar 21 05:59:34 crc kubenswrapper[4775]: I0321 05:59:34.419480 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-xljlp_f4a0a79a-5b67-44f8-9ef5-304530d5e764/nmstate-operator/0.log"
Mar 21 05:59:34 crc kubenswrapper[4775]: I0321 05:59:34.545172 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rptn7_944b76e5-c8c5-4cba-9df2-9e9b87b540a8/nmstate-webhook/0.log"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.160143 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567880-mnpzk"]
Mar 21 06:00:00 crc kubenswrapper[4775]: E0321 06:00:00.161368 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9f411-868c-4132-a586-c0f66e18edbc" containerName="oc"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.161387 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9f411-868c-4132-a586-c0f66e18edbc" containerName="oc"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.163798 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad9f411-868c-4132-a586-c0f66e18edbc" containerName="oc"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.164687 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-mnpzk"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.168471 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.168987 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.169021 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.176291 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"]
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.177749 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.181398 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.187612 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"]
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.192334 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.199182 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-mnpzk"]
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.313345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.313428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77jtw\" (UniqueName: \"kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw\") pod \"auto-csr-approver-29567880-mnpzk\" (UID: \"d8b58398-7316-48e6-91d4-184a3600d431\") " pod="openshift-infra/auto-csr-approver-29567880-mnpzk"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.313558 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfm6q\" (UniqueName: \"kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.313591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.414803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.414907 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.414968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77jtw\" (UniqueName: \"kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw\") pod \"auto-csr-approver-29567880-mnpzk\" (UID: \"d8b58398-7316-48e6-91d4-184a3600d431\") " pod="openshift-infra/auto-csr-approver-29567880-mnpzk"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.415140 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfm6q\" (UniqueName: \"kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.416388 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.434160 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.436987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfm6q\" (UniqueName: \"kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q\") pod \"collect-profiles-29567880-szvrl\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.438942 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77jtw\" (UniqueName: \"kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw\") pod \"auto-csr-approver-29567880-mnpzk\" (UID: \"d8b58398-7316-48e6-91d4-184a3600d431\") " pod="openshift-infra/auto-csr-approver-29567880-mnpzk"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.491585 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-mnpzk"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.511210 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:00 crc kubenswrapper[4775]: I0321 06:00:00.979692 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-mnpzk"]
Mar 21 06:00:01 crc kubenswrapper[4775]: W0321 06:00:01.057647 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e331421_c1bf_4386_b8df_666fe188438c.slice/crio-204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded WatchSource:0}: Error finding container 204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded: Status 404 returned error can't find the container with id 204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded
Mar 21 06:00:01 crc kubenswrapper[4775]: I0321 06:00:01.058069 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"]
Mar 21 06:00:01 crc kubenswrapper[4775]: I0321 06:00:01.176562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl" event={"ID":"7e331421-c1bf-4386-b8df-666fe188438c","Type":"ContainerStarted","Data":"204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded"}
Mar 21 06:00:01 crc kubenswrapper[4775]: I0321 06:00:01.177857 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-mnpzk" event={"ID":"d8b58398-7316-48e6-91d4-184a3600d431","Type":"ContainerStarted","Data":"7b89cbef9afa63d81b143d494e06f233d2505f8b9a3323a0aefa30926440691a"}
Mar 21 06:00:01 crc kubenswrapper[4775]: E0321 06:00:01.667231 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e331421_c1bf_4386_b8df_666fe188438c.slice/crio-conmon-1ca8dee3c1c4ec0e490473322114b11216cc0630376f7af892bdd947cce0e6a0.scope\": RecentStats: unable to find data in memory cache]"
Mar 21 06:00:02 crc kubenswrapper[4775]: I0321 06:00:02.207701 4775 generic.go:334] "Generic (PLEG): container finished" podID="7e331421-c1bf-4386-b8df-666fe188438c" containerID="1ca8dee3c1c4ec0e490473322114b11216cc0630376f7af892bdd947cce0e6a0" exitCode=0
Mar 21 06:00:02 crc kubenswrapper[4775]: I0321 06:00:02.207758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl" event={"ID":"7e331421-c1bf-4386-b8df-666fe188438c","Type":"ContainerDied","Data":"1ca8dee3c1c4ec0e490473322114b11216cc0630376f7af892bdd947cce0e6a0"}
Mar 21 06:00:02 crc kubenswrapper[4775]: I0321 06:00:02.482473 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 06:00:02 crc kubenswrapper[4775]: I0321 06:00:02.482531 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.537440 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl"
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.678646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume\") pod \"7e331421-c1bf-4386-b8df-666fe188438c\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") "
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.678863 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume\") pod \"7e331421-c1bf-4386-b8df-666fe188438c\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") "
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.678910 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfm6q\" (UniqueName: \"kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q\") pod \"7e331421-c1bf-4386-b8df-666fe188438c\" (UID: \"7e331421-c1bf-4386-b8df-666fe188438c\") "
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.680804 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e331421-c1bf-4386-b8df-666fe188438c" (UID: "7e331421-c1bf-4386-b8df-666fe188438c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.688279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q" (OuterVolumeSpecName: "kube-api-access-xfm6q") pod "7e331421-c1bf-4386-b8df-666fe188438c" (UID: "7e331421-c1bf-4386-b8df-666fe188438c"). InnerVolumeSpecName "kube-api-access-xfm6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.688950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e331421-c1bf-4386-b8df-666fe188438c" (UID: "7e331421-c1bf-4386-b8df-666fe188438c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.781275 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e331421-c1bf-4386-b8df-666fe188438c-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.781574 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfm6q\" (UniqueName: \"kubernetes.io/projected/7e331421-c1bf-4386-b8df-666fe188438c-kube-api-access-xfm6q\") on node \"crc\" DevicePath \"\""
Mar 21 06:00:03 crc kubenswrapper[4775]: I0321 06:00:03.781660 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e331421-c1bf-4386-b8df-666fe188438c-config-volume\") on node \"crc\" DevicePath \"\""
Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.224695 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl" event={"ID":"7e331421-c1bf-4386-b8df-666fe188438c","Type":"ContainerDied","Data":"204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded"}
Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.224990 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204ed16aa9d0e149e4e3fe9954d731f73acf13815efc498c74d76339962afded"
Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.224756 4775 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567880-szvrl" Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.624520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx"] Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.632596 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-47dnx"] Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.923009 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qq976_ac0be1f3-95f0-40a4-9a94-c74cdaad9590/controller/0.log" Mar 21 06:00:04 crc kubenswrapper[4775]: I0321 06:00:04.960506 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qq976_ac0be1f3-95f0-40a4-9a94-c74cdaad9590/kube-rbac-proxy/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.112378 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.382699 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.393203 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.393207 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.407526 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.555472 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.586784 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.657766 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.675315 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bebab26-7b09-4e00-adee-fffa3c06f5ab" path="/var/lib/kubelet/pods/8bebab26-7b09-4e00-adee-fffa3c06f5ab/volumes" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.683925 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.889200 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-metrics/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.905708 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-frr-files/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.905913 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/cp-reloader/0.log" Mar 21 06:00:05 crc kubenswrapper[4775]: I0321 06:00:05.911939 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/controller/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.158307 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/kube-rbac-proxy-frr/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.158765 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/frr-metrics/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.189591 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/kube-rbac-proxy/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.430418 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/reloader/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.450698 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-7jsnx_e5808d9b-074a-4948-8283-fdfea77c63bc/frr-k8s-webhook-server/0.log" Mar 21 06:00:06 crc kubenswrapper[4775]: I0321 06:00:06.883502 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6695f56dbb-f6tqn_38d80f78-fa33-49b7-99c2-62d50d1c011b/manager/0.log" Mar 21 06:00:07 crc kubenswrapper[4775]: I0321 06:00:07.320473 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-559bfcf5c-qqsvn_8c52832d-1aee-4eac-b625-24110b985402/webhook-server/0.log" Mar 21 06:00:07 crc kubenswrapper[4775]: I0321 06:00:07.403151 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cpw6m_1d9349cc-e186-40c5-bb71-c176ff4f0a0d/kube-rbac-proxy/0.log" Mar 21 06:00:07 crc kubenswrapper[4775]: I0321 06:00:07.819307 4775 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hmmgn_530a34fb-bf82-4a2e-afdd-ec646afdebcd/frr/0.log" Mar 21 06:00:07 crc kubenswrapper[4775]: I0321 06:00:07.979966 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cpw6m_1d9349cc-e186-40c5-bb71-c176ff4f0a0d/speaker/0.log" Mar 21 06:00:11 crc kubenswrapper[4775]: E0321 06:00:11.900171 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b58398_7316_48e6_91d4_184a3600d431.slice/crio-conmon-29077d3a4390d3dea3e0e351b0e11f03950cefc5d29477b332c73c6bd74c6c09.scope\": RecentStats: unable to find data in memory cache]" Mar 21 06:00:12 crc kubenswrapper[4775]: I0321 06:00:12.299193 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8b58398-7316-48e6-91d4-184a3600d431" containerID="29077d3a4390d3dea3e0e351b0e11f03950cefc5d29477b332c73c6bd74c6c09" exitCode=0 Mar 21 06:00:12 crc kubenswrapper[4775]: I0321 06:00:12.299239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-mnpzk" event={"ID":"d8b58398-7316-48e6-91d4-184a3600d431","Type":"ContainerDied","Data":"29077d3a4390d3dea3e0e351b0e11f03950cefc5d29477b332c73c6bd74c6c09"} Mar 21 06:00:13 crc kubenswrapper[4775]: I0321 06:00:13.701534 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-mnpzk" Mar 21 06:00:13 crc kubenswrapper[4775]: I0321 06:00:13.796898 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77jtw\" (UniqueName: \"kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw\") pod \"d8b58398-7316-48e6-91d4-184a3600d431\" (UID: \"d8b58398-7316-48e6-91d4-184a3600d431\") " Mar 21 06:00:13 crc kubenswrapper[4775]: I0321 06:00:13.806894 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw" (OuterVolumeSpecName: "kube-api-access-77jtw") pod "d8b58398-7316-48e6-91d4-184a3600d431" (UID: "d8b58398-7316-48e6-91d4-184a3600d431"). InnerVolumeSpecName "kube-api-access-77jtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:00:13 crc kubenswrapper[4775]: I0321 06:00:13.899146 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77jtw\" (UniqueName: \"kubernetes.io/projected/d8b58398-7316-48e6-91d4-184a3600d431-kube-api-access-77jtw\") on node \"crc\" DevicePath \"\"" Mar 21 06:00:14 crc kubenswrapper[4775]: I0321 06:00:14.316460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567880-mnpzk" event={"ID":"d8b58398-7316-48e6-91d4-184a3600d431","Type":"ContainerDied","Data":"7b89cbef9afa63d81b143d494e06f233d2505f8b9a3323a0aefa30926440691a"} Mar 21 06:00:14 crc kubenswrapper[4775]: I0321 06:00:14.316766 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b89cbef9afa63d81b143d494e06f233d2505f8b9a3323a0aefa30926440691a" Mar 21 06:00:14 crc kubenswrapper[4775]: I0321 06:00:14.316543 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567880-mnpzk" Mar 21 06:00:14 crc kubenswrapper[4775]: I0321 06:00:14.783864 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-8f686"] Mar 21 06:00:14 crc kubenswrapper[4775]: I0321 06:00:14.793956 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567874-8f686"] Mar 21 06:00:15 crc kubenswrapper[4775]: I0321 06:00:15.674964 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ebee46-23d7-4d38-946e-10f7a3238243" path="/var/lib/kubelet/pods/a6ebee46-23d7-4d38-946e-10f7a3238243/volumes" Mar 21 06:00:22 crc kubenswrapper[4775]: I0321 06:00:22.497518 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 06:00:22 crc kubenswrapper[4775]: I0321 06:00:22.753852 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 06:00:22 crc kubenswrapper[4775]: I0321 06:00:22.754587 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 06:00:22 crc kubenswrapper[4775]: I0321 06:00:22.760733 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.025219 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/pull/0.log" Mar 21 
06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.034635 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/extract/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.046935 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xc6qp_b2ca57eb-edac-4db7-bfa4-3198d80daf99/util/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.233641 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.394622 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.441574 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.443518 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.619832 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/extract/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.757009 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/pull/0.log" Mar 21 06:00:23 crc kubenswrapper[4775]: I0321 06:00:23.928215 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1ff62s_6252c43a-d149-46f7-ac6c-263c34980fe2/util/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.030401 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.226757 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.227318 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.232559 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.479104 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-utilities/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.484931 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/extract-content/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.735457 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.875310 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 06:00:24 crc kubenswrapper[4775]: I0321 06:00:24.949781 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.056848 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.165907 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-77cp9_08558c65-4599-4c64-bb0f-f18f94cecdec/registry-server/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.256409 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-utilities/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.338461 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/extract-content/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.469679 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7fcx_59fec450-4b61-4a15-b1b5-b47dedd649a0/marketplace-operator/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.740752 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.962352 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-x59zn_e8513867-3729-4d93-b8ca-45ecb69c50e6/registry-server/0.log" Mar 21 06:00:25 crc kubenswrapper[4775]: I0321 06:00:25.983088 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.029972 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.051534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.255573 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-content/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.268670 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/extract-utilities/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.441500 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h4wmk_4a9e7d52-67b3-4a38-a978-6566b1c7870a/registry-server/0.log" Mar 21 06:00:26 crc kubenswrapper[4775]: I0321 06:00:26.994184 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.153822 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.183534 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.188836 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.341708 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-utilities/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.408270 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/extract-content/0.log" Mar 21 06:00:27 crc kubenswrapper[4775]: I0321 06:00:27.878160 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gslhk_373fa4a4-80b8-4b32-a9ee-b272604d3adc/registry-server/0.log" Mar 21 06:00:32 crc kubenswrapper[4775]: I0321 06:00:32.482681 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qc7hn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 06:00:32 crc kubenswrapper[4775]: I0321 06:00:32.483361 4775 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 06:00:32 crc kubenswrapper[4775]: I0321 06:00:32.483414 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" Mar 21 06:00:32 crc kubenswrapper[4775]: I0321 06:00:32.484246 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226"} pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 06:00:32 crc kubenswrapper[4775]: I0321 06:00:32.484306 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" containerName="machine-config-daemon" containerID="cri-o://9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" gracePeriod=600 Mar 21 06:00:32 crc kubenswrapper[4775]: E0321 06:00:32.633427 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:00:33 crc kubenswrapper[4775]: I0321 06:00:33.518052 4775 generic.go:334] "Generic (PLEG): container finished" podID="cffcf487-ef41-4395-81eb-e5e6358f4a32" 
containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" exitCode=0 Mar 21 06:00:33 crc kubenswrapper[4775]: I0321 06:00:33.518114 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerDied","Data":"9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226"} Mar 21 06:00:33 crc kubenswrapper[4775]: I0321 06:00:33.518514 4775 scope.go:117] "RemoveContainer" containerID="fb650740af569c16423f24bd3abda67b78d6d33afa7ae7aa33a8a75f71e85f1c" Mar 21 06:00:33 crc kubenswrapper[4775]: I0321 06:00:33.519615 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:00:33 crc kubenswrapper[4775]: E0321 06:00:33.520525 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:00:47 crc kubenswrapper[4775]: I0321 06:00:47.679668 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:00:47 crc kubenswrapper[4775]: E0321 06:00:47.682287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:00:50 crc kubenswrapper[4775]: I0321 
06:00:50.277895 4775 scope.go:117] "RemoveContainer" containerID="b68cd9cb06e02040464cbd016f2e9b85be0a632889c514eaecf91f324598c72a" Mar 21 06:00:50 crc kubenswrapper[4775]: I0321 06:00:50.352653 4775 scope.go:117] "RemoveContainer" containerID="c3748d6274f12563ae1762dcd4c9c668e2bf7b8aecb55d574f50d517af661d16" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.158709 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567881-7fbgc"] Mar 21 06:01:00 crc kubenswrapper[4775]: E0321 06:01:00.159878 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b58398-7316-48e6-91d4-184a3600d431" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.159896 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b58398-7316-48e6-91d4-184a3600d431" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4775]: E0321 06:01:00.159950 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e331421-c1bf-4386-b8df-666fe188438c" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.159958 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e331421-c1bf-4386-b8df-666fe188438c" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.160200 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b58398-7316-48e6-91d4-184a3600d431" containerName="oc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.160223 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e331421-c1bf-4386-b8df-666fe188438c" containerName="collect-profiles" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.161003 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.191576 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567881-7fbgc"] Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.262848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.262962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.263165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.263244 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj8wz\" (UniqueName: \"kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.365023 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.365992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj8wz\" (UniqueName: \"kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.366071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.366097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.371666 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.371667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.386965 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj8wz\" (UniqueName: \"kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.393805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle\") pod \"keystone-cron-29567881-7fbgc\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.482900 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:00 crc kubenswrapper[4775]: I0321 06:01:00.662931 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:01:00 crc kubenswrapper[4775]: E0321 06:01:00.663685 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:01:01 crc kubenswrapper[4775]: I0321 06:01:01.137150 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567881-7fbgc"] Mar 21 06:01:01 crc kubenswrapper[4775]: I0321 06:01:01.820794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-7fbgc" event={"ID":"b00fa40d-721d-46a9-b871-d280a22f9f06","Type":"ContainerStarted","Data":"33e7299b76fea0dff44ff3c0e4495d43b700f349cc86461f84700fb6cb3a6026"} Mar 21 06:01:01 crc kubenswrapper[4775]: I0321 06:01:01.821180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-7fbgc" event={"ID":"b00fa40d-721d-46a9-b871-d280a22f9f06","Type":"ContainerStarted","Data":"9571a639ded309e03008689489bd3e624e56bd15417b47a2fcd34328d78ae2f1"} Mar 21 06:01:01 crc kubenswrapper[4775]: I0321 06:01:01.846031 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567881-7fbgc" podStartSLOduration=1.8460081019999999 podStartE2EDuration="1.846008102s" podCreationTimestamp="2026-03-21 06:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 06:01:01.842453361 
+0000 UTC m=+4414.818916985" watchObservedRunningTime="2026-03-21 06:01:01.846008102 +0000 UTC m=+4414.822471726" Mar 21 06:01:04 crc kubenswrapper[4775]: I0321 06:01:04.847634 4775 generic.go:334] "Generic (PLEG): container finished" podID="b00fa40d-721d-46a9-b871-d280a22f9f06" containerID="33e7299b76fea0dff44ff3c0e4495d43b700f349cc86461f84700fb6cb3a6026" exitCode=0 Mar 21 06:01:04 crc kubenswrapper[4775]: I0321 06:01:04.847874 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-7fbgc" event={"ID":"b00fa40d-721d-46a9-b871-d280a22f9f06","Type":"ContainerDied","Data":"33e7299b76fea0dff44ff3c0e4495d43b700f349cc86461f84700fb6cb3a6026"} Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.245422 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.393713 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj8wz\" (UniqueName: \"kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz\") pod \"b00fa40d-721d-46a9-b871-d280a22f9f06\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.393788 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data\") pod \"b00fa40d-721d-46a9-b871-d280a22f9f06\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.394660 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle\") pod \"b00fa40d-721d-46a9-b871-d280a22f9f06\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 
06:01:06.394783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys\") pod \"b00fa40d-721d-46a9-b871-d280a22f9f06\" (UID: \"b00fa40d-721d-46a9-b871-d280a22f9f06\") " Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.400805 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b00fa40d-721d-46a9-b871-d280a22f9f06" (UID: "b00fa40d-721d-46a9-b871-d280a22f9f06"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.422196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz" (OuterVolumeSpecName: "kube-api-access-nj8wz") pod "b00fa40d-721d-46a9-b871-d280a22f9f06" (UID: "b00fa40d-721d-46a9-b871-d280a22f9f06"). InnerVolumeSpecName "kube-api-access-nj8wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.426727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b00fa40d-721d-46a9-b871-d280a22f9f06" (UID: "b00fa40d-721d-46a9-b871-d280a22f9f06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.455031 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data" (OuterVolumeSpecName: "config-data") pod "b00fa40d-721d-46a9-b871-d280a22f9f06" (UID: "b00fa40d-721d-46a9-b871-d280a22f9f06"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.497650 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj8wz\" (UniqueName: \"kubernetes.io/projected/b00fa40d-721d-46a9-b871-d280a22f9f06-kube-api-access-nj8wz\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.497692 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.497705 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.497715 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b00fa40d-721d-46a9-b871-d280a22f9f06-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.872341 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567881-7fbgc" event={"ID":"b00fa40d-721d-46a9-b871-d280a22f9f06","Type":"ContainerDied","Data":"9571a639ded309e03008689489bd3e624e56bd15417b47a2fcd34328d78ae2f1"} Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.872410 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9571a639ded309e03008689489bd3e624e56bd15417b47a2fcd34328d78ae2f1" Mar 21 06:01:06 crc kubenswrapper[4775]: I0321 06:01:06.872527 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567881-7fbgc" Mar 21 06:01:14 crc kubenswrapper[4775]: I0321 06:01:14.662056 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:01:14 crc kubenswrapper[4775]: E0321 06:01:14.662878 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:01:28 crc kubenswrapper[4775]: I0321 06:01:28.661528 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:01:28 crc kubenswrapper[4775]: E0321 06:01:28.662483 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:01:43 crc kubenswrapper[4775]: I0321 06:01:43.662597 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:01:43 crc kubenswrapper[4775]: E0321 06:01:43.663415 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:01:56 crc kubenswrapper[4775]: I0321 06:01:56.662446 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:01:56 crc kubenswrapper[4775]: E0321 06:01:56.663120 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.153670 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567882-9lbmt"] Mar 21 06:02:00 crc kubenswrapper[4775]: E0321 06:02:00.154543 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00fa40d-721d-46a9-b871-d280a22f9f06" containerName="keystone-cron" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.154560 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00fa40d-721d-46a9-b871-d280a22f9f06" containerName="keystone-cron" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.154783 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00fa40d-721d-46a9-b871-d280a22f9f06" containerName="keystone-cron" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.155666 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.160225 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.160262 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.162476 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.165275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567882-9lbmt"] Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.230191 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gz9\" (UniqueName: \"kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9\") pod \"auto-csr-approver-29567882-9lbmt\" (UID: \"be3c940f-819c-4433-93d3-9e374377664b\") " pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.332607 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gz9\" (UniqueName: \"kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9\") pod \"auto-csr-approver-29567882-9lbmt\" (UID: \"be3c940f-819c-4433-93d3-9e374377664b\") " pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.353770 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gz9\" (UniqueName: \"kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9\") pod \"auto-csr-approver-29567882-9lbmt\" (UID: \"be3c940f-819c-4433-93d3-9e374377664b\") " 
pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.476326 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.921049 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567882-9lbmt"] Mar 21 06:02:00 crc kubenswrapper[4775]: I0321 06:02:00.928023 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 06:02:01 crc kubenswrapper[4775]: I0321 06:02:01.516413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" event={"ID":"be3c940f-819c-4433-93d3-9e374377664b","Type":"ContainerStarted","Data":"9196cc162f91871c5b955f0e73de9b175ceeba96a5016e153c7e7124c422f85b"} Mar 21 06:02:02 crc kubenswrapper[4775]: I0321 06:02:02.535283 4775 generic.go:334] "Generic (PLEG): container finished" podID="be3c940f-819c-4433-93d3-9e374377664b" containerID="abf324484875b628710ec29da86cbdd700864e96fb28df41c8b19dfdbc513b6e" exitCode=0 Mar 21 06:02:02 crc kubenswrapper[4775]: I0321 06:02:02.535585 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" event={"ID":"be3c940f-819c-4433-93d3-9e374377664b","Type":"ContainerDied","Data":"abf324484875b628710ec29da86cbdd700864e96fb28df41c8b19dfdbc513b6e"} Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.401604 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.516817 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7gz9\" (UniqueName: \"kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9\") pod \"be3c940f-819c-4433-93d3-9e374377664b\" (UID: \"be3c940f-819c-4433-93d3-9e374377664b\") " Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.526991 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9" (OuterVolumeSpecName: "kube-api-access-s7gz9") pod "be3c940f-819c-4433-93d3-9e374377664b" (UID: "be3c940f-819c-4433-93d3-9e374377664b"). InnerVolumeSpecName "kube-api-access-s7gz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.556885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" event={"ID":"be3c940f-819c-4433-93d3-9e374377664b","Type":"ContainerDied","Data":"9196cc162f91871c5b955f0e73de9b175ceeba96a5016e153c7e7124c422f85b"} Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.556924 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9196cc162f91871c5b955f0e73de9b175ceeba96a5016e153c7e7124c422f85b" Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.557229 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567882-9lbmt" Mar 21 06:02:04 crc kubenswrapper[4775]: I0321 06:02:04.619017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7gz9\" (UniqueName: \"kubernetes.io/projected/be3c940f-819c-4433-93d3-9e374377664b-kube-api-access-s7gz9\") on node \"crc\" DevicePath \"\"" Mar 21 06:02:04 crc kubenswrapper[4775]: E0321 06:02:04.718247 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe3c940f_819c_4433_93d3_9e374377664b.slice\": RecentStats: unable to find data in memory cache]" Mar 21 06:02:05 crc kubenswrapper[4775]: I0321 06:02:05.472050 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-cl4xc"] Mar 21 06:02:05 crc kubenswrapper[4775]: I0321 06:02:05.480392 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567876-cl4xc"] Mar 21 06:02:05 crc kubenswrapper[4775]: I0321 06:02:05.673181 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bcfe47-3b2b-4864-8659-69053e25c0a7" path="/var/lib/kubelet/pods/77bcfe47-3b2b-4864-8659-69053e25c0a7/volumes" Mar 21 06:02:12 crc kubenswrapper[4775]: I0321 06:02:12.662178 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:02:12 crc kubenswrapper[4775]: E0321 06:02:12.663338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:02:23 crc 
kubenswrapper[4775]: I0321 06:02:23.750780 4775 generic.go:334] "Generic (PLEG): container finished" podID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerID="9e98fbe03eab0e6c4baf6973b4872261d6f0c83d76001f189bcb6eab76312381" exitCode=0 Mar 21 06:02:23 crc kubenswrapper[4775]: I0321 06:02:23.750917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2gd5d/must-gather-fmnth" event={"ID":"f8e9d6cc-a02d-4747-8f7c-f439b3227b11","Type":"ContainerDied","Data":"9e98fbe03eab0e6c4baf6973b4872261d6f0c83d76001f189bcb6eab76312381"} Mar 21 06:02:23 crc kubenswrapper[4775]: I0321 06:02:23.753453 4775 scope.go:117] "RemoveContainer" containerID="9e98fbe03eab0e6c4baf6973b4872261d6f0c83d76001f189bcb6eab76312381" Mar 21 06:02:24 crc kubenswrapper[4775]: I0321 06:02:24.366948 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2gd5d_must-gather-fmnth_f8e9d6cc-a02d-4747-8f7c-f439b3227b11/gather/0.log" Mar 21 06:02:27 crc kubenswrapper[4775]: I0321 06:02:27.670881 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:02:27 crc kubenswrapper[4775]: E0321 06:02:27.673497 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.345046 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2gd5d/must-gather-fmnth"] Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.346200 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2gd5d/must-gather-fmnth" 
podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="copy" containerID="cri-o://7c0d546b19a6c80ff65ea248d24b3db62171d0f126c07845976d4d5ff49b7d79" gracePeriod=2 Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.355626 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2gd5d/must-gather-fmnth"] Mar 21 06:02:35 crc kubenswrapper[4775]: E0321 06:02:35.432266 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e9d6cc_a02d_4747_8f7c_f439b3227b11.slice/crio-7c0d546b19a6c80ff65ea248d24b3db62171d0f126c07845976d4d5ff49b7d79.scope\": RecentStats: unable to find data in memory cache]" Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.890922 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2gd5d_must-gather-fmnth_f8e9d6cc-a02d-4747-8f7c-f439b3227b11/copy/0.log" Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.892104 4775 generic.go:334] "Generic (PLEG): container finished" podID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerID="7c0d546b19a6c80ff65ea248d24b3db62171d0f126c07845976d4d5ff49b7d79" exitCode=143 Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.892171 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c4f80f0df39580d8656dd019ad5ecda89f94b6664de1b8c9c62c10087b36fe2" Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.928264 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2gd5d_must-gather-fmnth_f8e9d6cc-a02d-4747-8f7c-f439b3227b11/copy/0.log" Mar 21 06:02:35 crc kubenswrapper[4775]: I0321 06:02:35.928765 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.015755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jqm\" (UniqueName: \"kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm\") pod \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.015904 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output\") pod \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\" (UID: \"f8e9d6cc-a02d-4747-8f7c-f439b3227b11\") " Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.020876 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm" (OuterVolumeSpecName: "kube-api-access-k5jqm") pod "f8e9d6cc-a02d-4747-8f7c-f439b3227b11" (UID: "f8e9d6cc-a02d-4747-8f7c-f439b3227b11"). InnerVolumeSpecName "kube-api-access-k5jqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.119094 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jqm\" (UniqueName: \"kubernetes.io/projected/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-kube-api-access-k5jqm\") on node \"crc\" DevicePath \"\"" Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.186768 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f8e9d6cc-a02d-4747-8f7c-f439b3227b11" (UID: "f8e9d6cc-a02d-4747-8f7c-f439b3227b11"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.223472 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f8e9d6cc-a02d-4747-8f7c-f439b3227b11-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 06:02:36 crc kubenswrapper[4775]: I0321 06:02:36.900170 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2gd5d/must-gather-fmnth" Mar 21 06:02:37 crc kubenswrapper[4775]: I0321 06:02:37.673425 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" path="/var/lib/kubelet/pods/f8e9d6cc-a02d-4747-8f7c-f439b3227b11/volumes" Mar 21 06:02:38 crc kubenswrapper[4775]: I0321 06:02:38.661355 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:02:38 crc kubenswrapper[4775]: E0321 06:02:38.661922 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.640106 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:02:42 crc kubenswrapper[4775]: E0321 06:02:42.641001 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3c940f-819c-4433-93d3-9e374377664b" containerName="oc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641019 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3c940f-819c-4433-93d3-9e374377664b" containerName="oc" Mar 21 
06:02:42 crc kubenswrapper[4775]: E0321 06:02:42.641035 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="gather" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641043 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="gather" Mar 21 06:02:42 crc kubenswrapper[4775]: E0321 06:02:42.641070 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="copy" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641078 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="copy" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641340 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="copy" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641363 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3c940f-819c-4433-93d3-9e374377664b" containerName="oc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.641378 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e9d6cc-a02d-4747-8f7c-f439b3227b11" containerName="gather" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.642982 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.658086 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.751349 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.751449 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.751553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv4zx\" (UniqueName: \"kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.853778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.853913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.854075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv4zx\" (UniqueName: \"kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.854395 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.854636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.892229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv4zx\" (UniqueName: \"kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx\") pod \"redhat-operators-84fzc\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:42 crc kubenswrapper[4775]: I0321 06:02:42.970041 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:43 crc kubenswrapper[4775]: I0321 06:02:43.466278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:02:43 crc kubenswrapper[4775]: I0321 06:02:43.966225 4775 generic.go:334] "Generic (PLEG): container finished" podID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerID="544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be" exitCode=0 Mar 21 06:02:43 crc kubenswrapper[4775]: I0321 06:02:43.966286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerDied","Data":"544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be"} Mar 21 06:02:43 crc kubenswrapper[4775]: I0321 06:02:43.966559 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerStarted","Data":"9cd8085f69bacb1645282cffb1f31270cf94d93a58a88bcbb1dba70a7971766f"} Mar 21 06:02:45 crc kubenswrapper[4775]: I0321 06:02:45.991530 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerStarted","Data":"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413"} Mar 21 06:02:50 crc kubenswrapper[4775]: I0321 06:02:50.030928 4775 generic.go:334] "Generic (PLEG): container finished" podID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerID="daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413" exitCode=0 Mar 21 06:02:50 crc kubenswrapper[4775]: I0321 06:02:50.030984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" 
event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerDied","Data":"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413"} Mar 21 06:02:50 crc kubenswrapper[4775]: I0321 06:02:50.485481 4775 scope.go:117] "RemoveContainer" containerID="ff881789564104cd73abc465716a89e9559dbe987cab602aecfdcfd1bd6865c8" Mar 21 06:02:50 crc kubenswrapper[4775]: I0321 06:02:50.537212 4775 scope.go:117] "RemoveContainer" containerID="7c0d546b19a6c80ff65ea248d24b3db62171d0f126c07845976d4d5ff49b7d79" Mar 21 06:02:50 crc kubenswrapper[4775]: I0321 06:02:50.564597 4775 scope.go:117] "RemoveContainer" containerID="9e98fbe03eab0e6c4baf6973b4872261d6f0c83d76001f189bcb6eab76312381" Mar 21 06:02:51 crc kubenswrapper[4775]: I0321 06:02:51.044909 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerStarted","Data":"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41"} Mar 21 06:02:52 crc kubenswrapper[4775]: I0321 06:02:52.074676 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-84fzc" podStartSLOduration=3.369285068 podStartE2EDuration="10.074654767s" podCreationTimestamp="2026-03-21 06:02:42 +0000 UTC" firstStartedPulling="2026-03-21 06:02:43.96844029 +0000 UTC m=+4516.944903914" lastFinishedPulling="2026-03-21 06:02:50.673809989 +0000 UTC m=+4523.650273613" observedRunningTime="2026-03-21 06:02:52.0705268 +0000 UTC m=+4525.046990464" watchObservedRunningTime="2026-03-21 06:02:52.074654767 +0000 UTC m=+4525.051118401" Mar 21 06:02:52 crc kubenswrapper[4775]: I0321 06:02:52.661862 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:02:52 crc kubenswrapper[4775]: E0321 06:02:52.662169 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:02:52 crc kubenswrapper[4775]: I0321 06:02:52.970668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:52 crc kubenswrapper[4775]: I0321 06:02:52.970733 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:02:54 crc kubenswrapper[4775]: I0321 06:02:54.016739 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-84fzc" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="registry-server" probeResult="failure" output=< Mar 21 06:02:54 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Mar 21 06:02:54 crc kubenswrapper[4775]: > Mar 21 06:03:03 crc kubenswrapper[4775]: I0321 06:03:03.015967 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:03:03 crc kubenswrapper[4775]: I0321 06:03:03.073839 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:03:03 crc kubenswrapper[4775]: I0321 06:03:03.254904 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.158794 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-84fzc" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="registry-server" containerID="cri-o://ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41" 
gracePeriod=2 Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.667055 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.782478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities\") pod \"69163dff-f119-4580-8b11-b76aa49d1a3e\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.782717 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv4zx\" (UniqueName: \"kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx\") pod \"69163dff-f119-4580-8b11-b76aa49d1a3e\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.782815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content\") pod \"69163dff-f119-4580-8b11-b76aa49d1a3e\" (UID: \"69163dff-f119-4580-8b11-b76aa49d1a3e\") " Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.783586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities" (OuterVolumeSpecName: "utilities") pod "69163dff-f119-4580-8b11-b76aa49d1a3e" (UID: "69163dff-f119-4580-8b11-b76aa49d1a3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.790797 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx" (OuterVolumeSpecName: "kube-api-access-dv4zx") pod "69163dff-f119-4580-8b11-b76aa49d1a3e" (UID: "69163dff-f119-4580-8b11-b76aa49d1a3e"). InnerVolumeSpecName "kube-api-access-dv4zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.884969 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.885002 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv4zx\" (UniqueName: \"kubernetes.io/projected/69163dff-f119-4580-8b11-b76aa49d1a3e-kube-api-access-dv4zx\") on node \"crc\" DevicePath \"\"" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.936930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69163dff-f119-4580-8b11-b76aa49d1a3e" (UID: "69163dff-f119-4580-8b11-b76aa49d1a3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:03:04 crc kubenswrapper[4775]: I0321 06:03:04.986801 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69163dff-f119-4580-8b11-b76aa49d1a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.169058 4775 generic.go:334] "Generic (PLEG): container finished" podID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerID="ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41" exitCode=0 Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.169105 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerDied","Data":"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41"} Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.169154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84fzc" event={"ID":"69163dff-f119-4580-8b11-b76aa49d1a3e","Type":"ContainerDied","Data":"9cd8085f69bacb1645282cffb1f31270cf94d93a58a88bcbb1dba70a7971766f"} Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.169177 4775 scope.go:117] "RemoveContainer" containerID="ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.169199 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84fzc" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.190343 4775 scope.go:117] "RemoveContainer" containerID="daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.218950 4775 scope.go:117] "RemoveContainer" containerID="544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.219726 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.230161 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-84fzc"] Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.265392 4775 scope.go:117] "RemoveContainer" containerID="ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41" Mar 21 06:03:05 crc kubenswrapper[4775]: E0321 06:03:05.266060 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41\": container with ID starting with ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41 not found: ID does not exist" containerID="ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.266107 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41"} err="failed to get container status \"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41\": rpc error: code = NotFound desc = could not find container \"ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41\": container with ID starting with ba6cac641ea0a1f05579a2bf5c076d00e13cb75f839708ed34b79a2988888e41 not found: ID does 
not exist" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.266158 4775 scope.go:117] "RemoveContainer" containerID="daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413" Mar 21 06:03:05 crc kubenswrapper[4775]: E0321 06:03:05.266525 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413\": container with ID starting with daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413 not found: ID does not exist" containerID="daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.266546 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413"} err="failed to get container status \"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413\": rpc error: code = NotFound desc = could not find container \"daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413\": container with ID starting with daf99bcc88c5e92291e19ed807a395df95686881569148442d45d8d404d6c413 not found: ID does not exist" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.266559 4775 scope.go:117] "RemoveContainer" containerID="544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be" Mar 21 06:03:05 crc kubenswrapper[4775]: E0321 06:03:05.266920 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be\": container with ID starting with 544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be not found: ID does not exist" containerID="544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.266941 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be"} err="failed to get container status \"544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be\": rpc error: code = NotFound desc = could not find container \"544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be\": container with ID starting with 544948cf71c091c3b9668c3d1c57183b5e542b204afa23a4c531c026b2ee24be not found: ID does not exist" Mar 21 06:03:05 crc kubenswrapper[4775]: I0321 06:03:05.671790 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" path="/var/lib/kubelet/pods/69163dff-f119-4580-8b11-b76aa49d1a3e/volumes" Mar 21 06:03:07 crc kubenswrapper[4775]: I0321 06:03:07.670892 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:03:07 crc kubenswrapper[4775]: E0321 06:03:07.671524 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:03:22 crc kubenswrapper[4775]: I0321 06:03:22.661944 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:03:22 crc kubenswrapper[4775]: E0321 06:03:22.662881 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:03:35 crc kubenswrapper[4775]: I0321 06:03:35.662135 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:03:35 crc kubenswrapper[4775]: E0321 06:03:35.663003 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:03:47 crc kubenswrapper[4775]: I0321 06:03:47.692504 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:03:47 crc kubenswrapper[4775]: E0321 06:03:47.693240 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:03:58 crc kubenswrapper[4775]: I0321 06:03:58.661641 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:03:58 crc kubenswrapper[4775]: E0321 06:03:58.662502 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.138875 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567884-tsqsf"] Mar 21 06:04:00 crc kubenswrapper[4775]: E0321 06:04:00.139357 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="registry-server" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.139382 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="registry-server" Mar 21 06:04:00 crc kubenswrapper[4775]: E0321 06:04:00.139394 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="extract-utilities" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.139400 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="extract-utilities" Mar 21 06:04:00 crc kubenswrapper[4775]: E0321 06:04:00.139420 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="extract-content" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.139427 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="extract-content" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.139629 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="69163dff-f119-4580-8b11-b76aa49d1a3e" containerName="registry-server" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.140316 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.142460 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.143401 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.146455 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.149701 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567884-tsqsf"] Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.193822 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctxv\" (UniqueName: \"kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv\") pod \"auto-csr-approver-29567884-tsqsf\" (UID: \"f42b0960-590a-4bb2-815e-6b4e25ce9dc3\") " pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.297783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctxv\" (UniqueName: \"kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv\") pod \"auto-csr-approver-29567884-tsqsf\" (UID: \"f42b0960-590a-4bb2-815e-6b4e25ce9dc3\") " pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.319560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctxv\" (UniqueName: \"kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv\") pod \"auto-csr-approver-29567884-tsqsf\" (UID: \"f42b0960-590a-4bb2-815e-6b4e25ce9dc3\") " 
pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.461957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:00 crc kubenswrapper[4775]: I0321 06:04:00.965464 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567884-tsqsf"] Mar 21 06:04:00 crc kubenswrapper[4775]: W0321 06:04:00.967803 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf42b0960_590a_4bb2_815e_6b4e25ce9dc3.slice/crio-0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8 WatchSource:0}: Error finding container 0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8: Status 404 returned error can't find the container with id 0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8 Mar 21 06:04:01 crc kubenswrapper[4775]: I0321 06:04:01.679174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" event={"ID":"f42b0960-590a-4bb2-815e-6b4e25ce9dc3","Type":"ContainerStarted","Data":"0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8"} Mar 21 06:04:02 crc kubenswrapper[4775]: I0321 06:04:02.688158 4775 generic.go:334] "Generic (PLEG): container finished" podID="f42b0960-590a-4bb2-815e-6b4e25ce9dc3" containerID="841f733e77f2e32ed72910d2f3507d038abd01a9d5957766d5755d8922550f30" exitCode=0 Mar 21 06:04:02 crc kubenswrapper[4775]: I0321 06:04:02.688206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" event={"ID":"f42b0960-590a-4bb2-815e-6b4e25ce9dc3","Type":"ContainerDied","Data":"841f733e77f2e32ed72910d2f3507d038abd01a9d5957766d5755d8922550f30"} Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.079676 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.179862 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mctxv\" (UniqueName: \"kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv\") pod \"f42b0960-590a-4bb2-815e-6b4e25ce9dc3\" (UID: \"f42b0960-590a-4bb2-815e-6b4e25ce9dc3\") " Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.186001 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv" (OuterVolumeSpecName: "kube-api-access-mctxv") pod "f42b0960-590a-4bb2-815e-6b4e25ce9dc3" (UID: "f42b0960-590a-4bb2-815e-6b4e25ce9dc3"). InnerVolumeSpecName "kube-api-access-mctxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.282167 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mctxv\" (UniqueName: \"kubernetes.io/projected/f42b0960-590a-4bb2-815e-6b4e25ce9dc3-kube-api-access-mctxv\") on node \"crc\" DevicePath \"\"" Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.708650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" event={"ID":"f42b0960-590a-4bb2-815e-6b4e25ce9dc3","Type":"ContainerDied","Data":"0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8"} Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.708983 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bf0332673a1cf32952ca4af8bb8b08d38facf46b52d4741cb586d9c529963a8" Mar 21 06:04:04 crc kubenswrapper[4775]: I0321 06:04:04.708715 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567884-tsqsf" Mar 21 06:04:05 crc kubenswrapper[4775]: I0321 06:04:05.147109 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-jj8nx"] Mar 21 06:04:05 crc kubenswrapper[4775]: I0321 06:04:05.158802 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567878-jj8nx"] Mar 21 06:04:05 crc kubenswrapper[4775]: I0321 06:04:05.676667 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad9f411-868c-4132-a586-c0f66e18edbc" path="/var/lib/kubelet/pods/7ad9f411-868c-4132-a586-c0f66e18edbc/volumes" Mar 21 06:04:11 crc kubenswrapper[4775]: I0321 06:04:11.662100 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:04:11 crc kubenswrapper[4775]: E0321 06:04:11.663913 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:04:25 crc kubenswrapper[4775]: I0321 06:04:25.662150 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:04:25 crc kubenswrapper[4775]: E0321 06:04:25.663196 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:04:39 crc kubenswrapper[4775]: I0321 06:04:39.661853 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:04:39 crc kubenswrapper[4775]: E0321 06:04:39.662832 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:04:50 crc kubenswrapper[4775]: I0321 06:04:50.722274 4775 scope.go:117] "RemoveContainer" containerID="a70b4f1d1db681654636f9ae7783b422ab2ef1d53af567edd94ba3f694c4a39b" Mar 21 06:04:53 crc kubenswrapper[4775]: I0321 06:04:53.662319 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:04:53 crc kubenswrapper[4775]: E0321 06:04:53.663491 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:05:04 crc kubenswrapper[4775]: I0321 06:05:04.662345 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:05:04 crc kubenswrapper[4775]: E0321 06:05:04.663227 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:05:15 crc kubenswrapper[4775]: I0321 06:05:15.661238 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:05:15 crc kubenswrapper[4775]: E0321 06:05:15.661994 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:05:26 crc kubenswrapper[4775]: I0321 06:05:26.660886 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:05:26 crc kubenswrapper[4775]: E0321 06:05:26.661722 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qc7hn_openshift-machine-config-operator(cffcf487-ef41-4395-81eb-e5e6358f4a32)\"" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" podUID="cffcf487-ef41-4395-81eb-e5e6358f4a32" Mar 21 06:05:37 crc kubenswrapper[4775]: I0321 06:05:37.668315 4775 scope.go:117] "RemoveContainer" containerID="9a46f9d97e2c8b6a15fe9a04ccabbff9c60ceb48870c3b95f86f360ff5688226" Mar 21 06:05:38 crc kubenswrapper[4775]: I0321 06:05:38.572345 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qc7hn" 
event={"ID":"cffcf487-ef41-4395-81eb-e5e6358f4a32","Type":"ContainerStarted","Data":"9024ebf98588943f86acd69f0f23316451ade4ad0341403c68225759f21c46c1"} Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.229486 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:05:47 crc kubenswrapper[4775]: E0321 06:05:47.232901 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42b0960-590a-4bb2-815e-6b4e25ce9dc3" containerName="oc" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.233156 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42b0960-590a-4bb2-815e-6b4e25ce9dc3" containerName="oc" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.233649 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f42b0960-590a-4bb2-815e-6b4e25ce9dc3" containerName="oc" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.236165 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.253191 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.345865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc56q\" (UniqueName: \"kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.346099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content\") pod \"certified-operators-8zwmh\" (UID: 
\"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.346177 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.448317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc56q\" (UniqueName: \"kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.448418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.448465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.448999 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content\") pod \"certified-operators-8zwmh\" (UID: 
\"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.449093 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.467785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc56q\" (UniqueName: \"kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q\") pod \"certified-operators-8zwmh\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:47 crc kubenswrapper[4775]: I0321 06:05:47.568231 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:48 crc kubenswrapper[4775]: I0321 06:05:48.105407 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:05:48 crc kubenswrapper[4775]: W0321 06:05:48.116263 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9abf3c52_1211_4152_9055_2f7fab20a05c.slice/crio-9510d881753680b23173127d27b1c5a3df1e24eb0470c5787f57c2dbe06568c7 WatchSource:0}: Error finding container 9510d881753680b23173127d27b1c5a3df1e24eb0470c5787f57c2dbe06568c7: Status 404 returned error can't find the container with id 9510d881753680b23173127d27b1c5a3df1e24eb0470c5787f57c2dbe06568c7 Mar 21 06:05:48 crc kubenswrapper[4775]: I0321 06:05:48.702048 4775 generic.go:334] "Generic (PLEG): container finished" podID="9abf3c52-1211-4152-9055-2f7fab20a05c" 
containerID="534acc5a77390359561da3780439f94a536281e282223833950dbd423176d301" exitCode=0 Mar 21 06:05:48 crc kubenswrapper[4775]: I0321 06:05:48.702099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerDied","Data":"534acc5a77390359561da3780439f94a536281e282223833950dbd423176d301"} Mar 21 06:05:48 crc kubenswrapper[4775]: I0321 06:05:48.702154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerStarted","Data":"9510d881753680b23173127d27b1c5a3df1e24eb0470c5787f57c2dbe06568c7"} Mar 21 06:05:49 crc kubenswrapper[4775]: I0321 06:05:49.715064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerStarted","Data":"4dd9bdd9452b82ca269c65787b108c5f00b05f91291a136b5fe6abe042914d85"} Mar 21 06:05:50 crc kubenswrapper[4775]: I0321 06:05:50.726374 4775 generic.go:334] "Generic (PLEG): container finished" podID="9abf3c52-1211-4152-9055-2f7fab20a05c" containerID="4dd9bdd9452b82ca269c65787b108c5f00b05f91291a136b5fe6abe042914d85" exitCode=0 Mar 21 06:05:50 crc kubenswrapper[4775]: I0321 06:05:50.726446 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerDied","Data":"4dd9bdd9452b82ca269c65787b108c5f00b05f91291a136b5fe6abe042914d85"} Mar 21 06:05:51 crc kubenswrapper[4775]: I0321 06:05:51.739932 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerStarted","Data":"6f21776a69e4b4c37940f886b036f5f88e915ca03c7c397a3252b31ac6596726"} Mar 21 06:05:51 crc 
kubenswrapper[4775]: I0321 06:05:51.759489 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8zwmh" podStartSLOduration=2.227601785 podStartE2EDuration="4.759470927s" podCreationTimestamp="2026-03-21 06:05:47 +0000 UTC" firstStartedPulling="2026-03-21 06:05:48.704837266 +0000 UTC m=+4701.681300890" lastFinishedPulling="2026-03-21 06:05:51.236706408 +0000 UTC m=+4704.213170032" observedRunningTime="2026-03-21 06:05:51.757054299 +0000 UTC m=+4704.733517943" watchObservedRunningTime="2026-03-21 06:05:51.759470927 +0000 UTC m=+4704.735934541" Mar 21 06:05:57 crc kubenswrapper[4775]: I0321 06:05:57.568832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:57 crc kubenswrapper[4775]: I0321 06:05:57.569476 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:57 crc kubenswrapper[4775]: I0321 06:05:57.625619 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:58 crc kubenswrapper[4775]: I0321 06:05:58.416674 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:05:58 crc kubenswrapper[4775]: I0321 06:05:58.474718 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.159743 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567886-pl2fv"] Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.162582 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.164961 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.165152 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.165274 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-299gm" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.174932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9wl\" (UniqueName: \"kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl\") pod \"auto-csr-approver-29567886-pl2fv\" (UID: \"1ea4f899-dfac-4aa0-98be-f676c433fea5\") " pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.184444 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567886-pl2fv"] Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.275724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9wl\" (UniqueName: \"kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl\") pod \"auto-csr-approver-29567886-pl2fv\" (UID: \"1ea4f899-dfac-4aa0-98be-f676c433fea5\") " pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.294171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9wl\" (UniqueName: \"kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl\") pod \"auto-csr-approver-29567886-pl2fv\" (UID: \"1ea4f899-dfac-4aa0-98be-f676c433fea5\") " 
pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.377672 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8zwmh" podUID="9abf3c52-1211-4152-9055-2f7fab20a05c" containerName="registry-server" containerID="cri-o://6f21776a69e4b4c37940f886b036f5f88e915ca03c7c397a3252b31ac6596726" gracePeriod=2 Mar 21 06:06:00 crc kubenswrapper[4775]: I0321 06:06:00.503681 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:01 crc kubenswrapper[4775]: I0321 06:06:01.007673 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567886-pl2fv"] Mar 21 06:06:01 crc kubenswrapper[4775]: I0321 06:06:01.387360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" event={"ID":"1ea4f899-dfac-4aa0-98be-f676c433fea5","Type":"ContainerStarted","Data":"24a84203961e192c5ab668a37933451cf3776227a8956a53eabc40295ee336f4"} Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.412012 4775 generic.go:334] "Generic (PLEG): container finished" podID="9abf3c52-1211-4152-9055-2f7fab20a05c" containerID="6f21776a69e4b4c37940f886b036f5f88e915ca03c7c397a3252b31ac6596726" exitCode=0 Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.412417 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerDied","Data":"6f21776a69e4b4c37940f886b036f5f88e915ca03c7c397a3252b31ac6596726"} Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.692164 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.707007 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc56q\" (UniqueName: \"kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q\") pod \"9abf3c52-1211-4152-9055-2f7fab20a05c\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.707241 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities\") pod \"9abf3c52-1211-4152-9055-2f7fab20a05c\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.707345 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content\") pod \"9abf3c52-1211-4152-9055-2f7fab20a05c\" (UID: \"9abf3c52-1211-4152-9055-2f7fab20a05c\") " Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.708380 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities" (OuterVolumeSpecName: "utilities") pod "9abf3c52-1211-4152-9055-2f7fab20a05c" (UID: "9abf3c52-1211-4152-9055-2f7fab20a05c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.723613 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q" (OuterVolumeSpecName: "kube-api-access-bc56q") pod "9abf3c52-1211-4152-9055-2f7fab20a05c" (UID: "9abf3c52-1211-4152-9055-2f7fab20a05c"). InnerVolumeSpecName "kube-api-access-bc56q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.783589 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9abf3c52-1211-4152-9055-2f7fab20a05c" (UID: "9abf3c52-1211-4152-9055-2f7fab20a05c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.810199 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.810233 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc56q\" (UniqueName: \"kubernetes.io/projected/9abf3c52-1211-4152-9055-2f7fab20a05c-kube-api-access-bc56q\") on node \"crc\" DevicePath \"\"" Mar 21 06:06:03 crc kubenswrapper[4775]: I0321 06:06:03.810244 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9abf3c52-1211-4152-9055-2f7fab20a05c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.423532 4775 generic.go:334] "Generic (PLEG): container finished" podID="1ea4f899-dfac-4aa0-98be-f676c433fea5" containerID="cd3492d0ce0e47fc31fe01e0df3071ac761f4c482390de6fdf488d608d7c7ca3" exitCode=0 Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.423630 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" event={"ID":"1ea4f899-dfac-4aa0-98be-f676c433fea5","Type":"ContainerDied","Data":"cd3492d0ce0e47fc31fe01e0df3071ac761f4c482390de6fdf488d608d7c7ca3"} Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.426868 4775 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8zwmh" event={"ID":"9abf3c52-1211-4152-9055-2f7fab20a05c","Type":"ContainerDied","Data":"9510d881753680b23173127d27b1c5a3df1e24eb0470c5787f57c2dbe06568c7"} Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.426935 4775 scope.go:117] "RemoveContainer" containerID="6f21776a69e4b4c37940f886b036f5f88e915ca03c7c397a3252b31ac6596726" Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.427013 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zwmh" Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.455601 4775 scope.go:117] "RemoveContainer" containerID="4dd9bdd9452b82ca269c65787b108c5f00b05f91291a136b5fe6abe042914d85" Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.487895 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.495442 4775 scope.go:117] "RemoveContainer" containerID="534acc5a77390359561da3780439f94a536281e282223833950dbd423176d301" Mar 21 06:06:04 crc kubenswrapper[4775]: I0321 06:06:04.497485 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8zwmh"] Mar 21 06:06:05 crc kubenswrapper[4775]: I0321 06:06:05.674243 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abf3c52-1211-4152-9055-2f7fab20a05c" path="/var/lib/kubelet/pods/9abf3c52-1211-4152-9055-2f7fab20a05c/volumes" Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.094008 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.257599 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb9wl\" (UniqueName: \"kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl\") pod \"1ea4f899-dfac-4aa0-98be-f676c433fea5\" (UID: \"1ea4f899-dfac-4aa0-98be-f676c433fea5\") " Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.264471 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl" (OuterVolumeSpecName: "kube-api-access-kb9wl") pod "1ea4f899-dfac-4aa0-98be-f676c433fea5" (UID: "1ea4f899-dfac-4aa0-98be-f676c433fea5"). InnerVolumeSpecName "kube-api-access-kb9wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.360034 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb9wl\" (UniqueName: \"kubernetes.io/projected/1ea4f899-dfac-4aa0-98be-f676c433fea5-kube-api-access-kb9wl\") on node \"crc\" DevicePath \"\"" Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.450300 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" event={"ID":"1ea4f899-dfac-4aa0-98be-f676c433fea5","Type":"ContainerDied","Data":"24a84203961e192c5ab668a37933451cf3776227a8956a53eabc40295ee336f4"} Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.450349 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567886-pl2fv" Mar 21 06:06:06 crc kubenswrapper[4775]: I0321 06:06:06.450353 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24a84203961e192c5ab668a37933451cf3776227a8956a53eabc40295ee336f4" Mar 21 06:06:07 crc kubenswrapper[4775]: I0321 06:06:07.164526 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-mnpzk"] Mar 21 06:06:07 crc kubenswrapper[4775]: I0321 06:06:07.174018 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567880-mnpzk"] Mar 21 06:06:07 crc kubenswrapper[4775]: I0321 06:06:07.674070 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b58398-7316-48e6-91d4-184a3600d431" path="/var/lib/kubelet/pods/d8b58398-7316-48e6-91d4-184a3600d431/volumes"